2026-03-08T22:36:12.437 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-08T22:36:12.440 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T22:36:12.457 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/276
branch: squid
description: rados:standalone/{supported-random-distro$/{centos_latest} workloads/erasure-code}
email: null
first_in_suite: false
flavor: default
job_id: '276'
last_in_suite: false
machine_type: vps
name: kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps
no_nested_subset: false
openstack:
- volumes:
    count: 3
    size: 10
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    conf:
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        debug ms: 1
        debug osd: 20
        osd mclock iops capacity threshold hdd: 49000
    flavor: default
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
      extra_system_packages:
        deb:
        - python3-xmltodict
        - python3-jmespath
        rpm:
        - bzip2
        - perl-Test-Harness
        - python3-xmltodict
        - python3-jmespath
  selinux:
    allowlist:
    - scontext=system_u:system_r:getty_t:s0
  workunit:
    branch: tt-squid
    sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 5909
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
suite: rados:standalone
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
targets:
  vm04.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKNKVGgDfdVvSebF1Z7SLTKtLYgrm95xrPDlRRMkaKJZbzIlNutEzXG05p+BMFp8gBtaVnNibHYVfxO+I+lW518=
tasks:
- install: null
- workunit:
    basedir: qa/standalone
    clients:
      all:
      - erasure-code
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-08_21:49:43
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
2026-03-08T22:36:12.457 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa; will attempt to use it
2026-03-08T22:36:12.458 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks
2026-03-08T22:36:12.458 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-08T22:36:12.458 INFO:teuthology.task.internal:Checking packages...
2026-03-08T22:36:12.458 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-08T22:36:12.458 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-08T22:36:12.458 INFO:teuthology.packaging:ref: None
2026-03-08T22:36:12.458 INFO:teuthology.packaging:tag: None
2026-03-08T22:36:12.458 INFO:teuthology.packaging:branch: squid
2026-03-08T22:36:12.458 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:36:12.459 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid
2026-03-08T22:36:13.252 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb
2026-03-08T22:36:13.253 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-08T22:36:13.253 INFO:teuthology.task.internal:no buildpackages task found
2026-03-08T22:36:13.253 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-08T22:36:13.254 INFO:teuthology.task.internal:Saving configuration
2026-03-08T22:36:13.257 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-08T22:36:13.258 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-08T22:36:13.265 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm04.local', 'description': '/archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/276', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-08 22:35:36.200944', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:04', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKNKVGgDfdVvSebF1Z7SLTKtLYgrm95xrPDlRRMkaKJZbzIlNutEzXG05p+BMFp8gBtaVnNibHYVfxO+I+lW518='}
2026-03-08T22:36:13.265 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-08T22:36:13.266 INFO:teuthology.task.internal:roles: ubuntu@vm04.local - ['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-08T22:36:13.266 INFO:teuthology.run_tasks:Running task console_log...
2026-03-08T22:36:13.272 DEBUG:teuthology.task.console_log:vm04 does not support IPMI; excluding
2026-03-08T22:36:13.273 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f1799c08280>, signals=[15])
2026-03-08T22:36:13.273 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-08T22:36:13.273 INFO:teuthology.task.internal:Opening connections...
2026-03-08T22:36:13.273 DEBUG:teuthology.task.internal:connecting to ubuntu@vm04.local
2026-03-08T22:36:13.274 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm04.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:36:13.332 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-08T22:36:13.333 DEBUG:teuthology.orchestra.run.vm04:> uname -m
2026-03-08T22:36:13.484 INFO:teuthology.orchestra.run.vm04.stdout:x86_64
2026-03-08T22:36:13.484 DEBUG:teuthology.orchestra.run.vm04:> cat /etc/os-release
2026-03-08T22:36:13.539 INFO:teuthology.orchestra.run.vm04.stdout:NAME="CentOS Stream"
2026-03-08T22:36:13.539 INFO:teuthology.orchestra.run.vm04.stdout:VERSION="9"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:ID="centos"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:ID_LIKE="rhel fedora"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:VERSION_ID="9"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:PLATFORM_ID="platform:el9"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:ANSI_COLOR="0;31"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:LOGO="fedora-logo-icon"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:HOME_URL="https://centos.org/"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-08T22:36:13.540 INFO:teuthology.orchestra.run.vm04.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-08T22:36:13.540 INFO:teuthology.lock.ops:Updating vm04.local on lock server
2026-03-08T22:36:13.544 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-08T22:36:13.546 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-08T22:36:13.547 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-08T22:36:13.547 DEBUG:teuthology.orchestra.run.vm04:> test '!' -e /home/ubuntu/cephtest
2026-03-08T22:36:13.594 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-08T22:36:13.595 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-08T22:36:13.595 DEBUG:teuthology.orchestra.run.vm04:> test -z $(ls -A /var/lib/ceph)
2026-03-08T22:36:13.649 INFO:teuthology.orchestra.run.vm04.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-08T22:36:13.658 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-08T22:36:13.666 DEBUG:teuthology.orchestra.run.vm04:> test -e /ceph-qa-ready
2026-03-08T22:36:13.712 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:36:13.892 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-08T22:36:13.894 INFO:teuthology.task.internal:Creating test directory...
2026-03-08T22:36:13.894 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-08T22:36:13.909 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-08T22:36:13.910 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-08T22:36:13.911 INFO:teuthology.task.internal:Creating archive directory...
2026-03-08T22:36:13.911 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-08T22:36:13.967 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-08T22:36:13.968 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-08T22:36:13.968 DEBUG:teuthology.orchestra.run.vm04:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-08T22:36:14.020 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:36:14.021 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-08T22:36:14.085 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:36:14.094 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:36:14.095 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-08T22:36:14.098 INFO:teuthology.task.internal:Configuring sudo...
2026-03-08T22:36:14.098 DEBUG:teuthology.orchestra.run.vm04:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-08T22:36:14.162 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-08T22:36:14.165 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-08T22:36:14.165 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-08T22:36:14.227 DEBUG:teuthology.orchestra.run.vm04:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T22:36:14.293 DEBUG:teuthology.orchestra.run.vm04:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T22:36:14.350 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-08T22:36:14.350 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-08T22:36:14.415 DEBUG:teuthology.orchestra.run.vm04:> sudo service rsyslog restart
2026-03-08T22:36:14.487 INFO:teuthology.orchestra.run.vm04.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-08T22:36:14.939 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-08T22:36:14.941 INFO:teuthology.task.internal:Starting timer...
2026-03-08T22:36:14.941 INFO:teuthology.run_tasks:Running task pcp...
2026-03-08T22:36:14.944 INFO:teuthology.run_tasks:Running task selinux...
2026-03-08T22:36:14.946 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:getty_t:s0']}
2026-03-08T22:36:14.946 INFO:teuthology.task.selinux:Excluding vm04: VMs are not yet supported
2026-03-08T22:36:14.946 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-08T22:36:14.946 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-08T22:36:14.946 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-08T22:36:14.946 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-08T22:36:14.947 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-08T22:36:14.948 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-08T22:36:14.953 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-08T22:36:14.953 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventorynmd3s1tj --limit vm04.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-08T22:38:04.650 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm04.local')]
2026-03-08T22:38:04.650 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm04.local'
2026-03-08T22:38:04.651 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm04.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:38:04.718 DEBUG:teuthology.orchestra.run.vm04:> true
2026-03-08T22:38:04.797 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm04.local'
2026-03-08T22:38:04.797 INFO:teuthology.run_tasks:Running task clock...
2026-03-08T22:38:04.800 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-08T22:38:04.800 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-08T22:38:04.800 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T22:38:04.879 INFO:teuthology.orchestra.run.vm04.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-08T22:38:04.899 INFO:teuthology.orchestra.run.vm04.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-08T22:38:04.925 INFO:teuthology.orchestra.run.vm04.stderr:sudo: ntpd: command not found
2026-03-08T22:38:04.942 INFO:teuthology.orchestra.run.vm04.stdout:506 Cannot talk to daemon
2026-03-08T22:38:04.960 INFO:teuthology.orchestra.run.vm04.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-08T22:38:04.977 INFO:teuthology.orchestra.run.vm04.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-08T22:38:05.024 INFO:teuthology.orchestra.run.vm04.stderr:bash: line 1: ntpq: command not found
2026-03-08T22:38:05.025 INFO:teuthology.orchestra.run.vm04.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-08T22:38:05.025 INFO:teuthology.orchestra.run.vm04.stdout:===============================================================================
2026-03-08T22:38:05.027 INFO:teuthology.run_tasks:Running task install...
2026-03-08T22:38:05.029 DEBUG:teuthology.task.install:project ceph
2026-03-08T22:38:05.029 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:38:05.029 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:38:05.029 INFO:teuthology.task.install:Using flavor: default
2026-03-08T22:38:05.031 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-08T22:38:05.031 INFO:teuthology.task.install:extra packages: []
2026-03-08T22:38:05.032 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': [], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': None, 'wait_for_package': False}
2026-03-08T22:38:05.032 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:38:05.631 INFO:teuthology.task.install.rpm:Pulling from https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/
2026-03-08T22:38:05.631 INFO:teuthology.task.install.rpm:Package version is 19.2.3-678.ge911bdeb
2026-03-08T22:38:06.119 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-noarch]
name=ceph noarch packages
baseurl=https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-source]
name=ceph source packages
baseurl=https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
2026-03-08T22:38:06.119 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-08T22:38:06.119 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/yum.repos.d/ceph.repo
2026-03-08T22:38:06.161 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, ceph-volume, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64
2026-03-08T22:38:06.162 DEBUG:teuthology.orchestra.run.vm04:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;sha1/e911bdebe5c8faa3800735d1568fcdca65db60df/;g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-08T22:38:06.236 DEBUG:teuthology.orchestra.run.vm04:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-08T22:38:06.327 DEBUG:teuthology.orchestra.run.vm04:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-08T22:38:06.357 INFO:teuthology.orchestra.run.vm04.stdout:check_obsoletes = 1
2026-03-08T22:38:06.359 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean all
2026-03-08T22:38:06.553 INFO:teuthology.orchestra.run.vm04.stdout:41 files removed
2026-03-08T22:38:06.574 DEBUG:teuthology.orchestra.run.vm04:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath
2026-03-08T22:38:07.962 INFO:teuthology.orchestra.run.vm04.stdout:ceph packages for x86_64 71 kB/s | 84 kB 00:01
2026-03-08T22:38:08.965 INFO:teuthology.orchestra.run.vm04.stdout:ceph noarch packages 12 kB/s | 12 kB 00:00
2026-03-08T22:38:09.925 INFO:teuthology.orchestra.run.vm04.stdout:ceph source packages 2.0 kB/s | 1.9 kB 00:00
2026-03-08T22:38:11.219 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - BaseOS 7.0 MB/s | 8.9 MB 00:01
2026-03-08T22:38:13.178 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - AppStream 24 MB/s | 27 MB 00:01
2026-03-08T22:38:17.616 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - CRB 7.0 MB/s | 8.0 MB 00:01
2026-03-08T22:38:19.380 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - Extras packages 30 kB/s | 20 kB 00:00
2026-03-08T22:38:19.888 INFO:teuthology.orchestra.run.vm04.stdout:Extra Packages for Enterprise Linux 50 MB/s | 20 MB 00:00
2026-03-08T22:38:25.475 INFO:teuthology.orchestra.run.vm04.stdout:lab-extras 65 kB/s | 50 kB 00:00
2026-03-08T22:38:27.282 INFO:teuthology.orchestra.run.vm04.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-08T22:38:27.283 INFO:teuthology.orchestra.run.vm04.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-08T22:38:27.291 INFO:teuthology.orchestra.run.vm04.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-08T22:38:27.292 INFO:teuthology.orchestra.run.vm04.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-08T22:38:27.341 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout:======================================================================================
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout:======================================================================================
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout:Installing:
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 6.5 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 5.5 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 1.2 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 145 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 1.1 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 150 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 3.8 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 7.4 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 49 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 11 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 50 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-volume noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 299 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: cephadm noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 769 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 34 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 1.0 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 127 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 165 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 323 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 303 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 100 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 85 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 3.1 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 171 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout:Upgrading:
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: librados2 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 3.4 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: librbd1 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 3.2 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout:Installing dependencies:
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: abseil-cpp x86_64 20211102.0-4.el9 epel 551 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 22 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 31 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 2.4 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 253 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 4.7 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 17 M
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 17 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 25 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: cryptsetup x86_64 2.8.1-3.el9 baseos 351 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-08T22:38:27.349 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: grpc-data noarch 1.46.7-10.el9 epel 19 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 163 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libnbd x86_64 1.20.3-4.el9 appstream 164 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 503 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: librgw2 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 5.4 M
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: lua x86_64 5.4.4-4.el9 appstream 188 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: lua-devel x86_64 5.4.4-4.el9 crb 22 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: luarocks noarch 3.9.2-5.el9 epel 151 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: pciutils x86_64 3.7.0-7.el9 baseos 93 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: protobuf x86_64 3.14.0-17.el9 appstream 1.0 M
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-compiler x86_64 3.14.0-17.el9 crb 862 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 45 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 142 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio x86_64 1.46.7-10.el9 epel 2.0 M
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-tools x86_64 1.46.7-10.el9 epel 144 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-08T22:38:27.350 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-packaging noarch 20.9-5.el9 appstream 77 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-protobuf noarch 3.14.0-17.el9 appstream 267 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout:
python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyparsing noarch 2.4.7-9.el9 baseos 150 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: qatlib x86_64 
25.08.0-2.el9 appstream 240 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: qatzip-libs x86_64 1.3.1-1.el9 appstream 66 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: unzip x86_64 6.0-59.el9 baseos 182 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: zip x86_64 3.0-35.el9 baseos 266 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout:Installing weak dependencies: 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: qatlib-service x86_64 25.08.0-2.el9 appstream 37 k 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout:====================================================================================== 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout:Install 135 Packages 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout:Upgrade 2 Packages 2026-03-08T22:38:27.351 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T22:38:27.356 INFO:teuthology.orchestra.run.vm04.stdout:Total download size: 210 M 2026-03-08T22:38:27.357 INFO:teuthology.orchestra.run.vm04.stdout:Downloading Packages: 2026-03-08T22:38:29.500 INFO:teuthology.orchestra.run.vm04.stdout:(1/137): ceph-19.2.3-678.ge911bdeb.el9.x86_64.r 14 kB/s | 6.5 kB 00:00 2026-03-08T22:38:30.303 INFO:teuthology.orchestra.run.vm04.stdout:(2/137): ceph-fuse-19.2.3-678.ge911bdeb.el9.x86 1.4 MB/s | 1.2 MB 00:00 
2026-03-08T22:38:30.417 INFO:teuthology.orchestra.run.vm04.stdout:(3/137): ceph-immutable-object-cache-19.2.3-678 1.2 MB/s | 145 kB 00:00 2026-03-08T22:38:30.516 INFO:teuthology.orchestra.run.vm04.stdout:(4/137): ceph-base-19.2.3-678.ge911bdeb.el9.x86 3.7 MB/s | 5.5 MB 00:01 2026-03-08T22:38:30.641 INFO:teuthology.orchestra.run.vm04.stdout:(5/137): ceph-mgr-19.2.3-678.ge911bdeb.el9.x86_ 8.6 MB/s | 1.1 MB 00:00 2026-03-08T22:38:30.664 INFO:teuthology.orchestra.run.vm04.stdout:(6/137): ceph-mds-19.2.3-678.ge911bdeb.el9.x86_ 9.8 MB/s | 2.4 MB 00:00 2026-03-08T22:38:30.997 INFO:teuthology.orchestra.run.vm04.stdout:(7/137): ceph-mon-19.2.3-678.ge911bdeb.el9.x86_ 13 MB/s | 4.7 MB 00:00 2026-03-08T22:38:31.605 INFO:teuthology.orchestra.run.vm04.stdout:(8/137): ceph-common-19.2.3-678.ge911bdeb.el9.x 8.5 MB/s | 22 MB 00:02 2026-03-08T22:38:31.720 INFO:teuthology.orchestra.run.vm04.stdout:(9/137): ceph-radosgw-19.2.3-678.ge911bdeb.el9. 15 MB/s | 11 MB 00:00 2026-03-08T22:38:31.779 INFO:teuthology.orchestra.run.vm04.stdout:(10/137): ceph-osd-19.2.3-678.ge911bdeb.el9.x86 15 MB/s | 17 MB 00:01 2026-03-08T22:38:31.779 INFO:teuthology.orchestra.run.vm04.stdout:(11/137): ceph-selinux-19.2.3-678.ge911bdeb.el9 142 kB/s | 25 kB 00:00 2026-03-08T22:38:31.891 INFO:teuthology.orchestra.run.vm04.stdout:(12/137): libcephfs-devel-19.2.3-678.ge911bdeb. 
299 kB/s | 34 kB 00:00 2026-03-08T22:38:32.006 INFO:teuthology.orchestra.run.vm04.stdout:(13/137): libcephsqlite-19.2.3-678.ge911bdeb.el 1.4 MB/s | 163 kB 00:00 2026-03-08T22:38:32.022 INFO:teuthology.orchestra.run.vm04.stdout:(14/137): libcephfs2-19.2.3-678.ge911bdeb.el9.x 4.0 MB/s | 1.0 MB 00:00 2026-03-08T22:38:32.120 INFO:teuthology.orchestra.run.vm04.stdout:(15/137): librados-devel-19.2.3-678.ge911bdeb.e 1.1 MB/s | 127 kB 00:00 2026-03-08T22:38:32.144 INFO:teuthology.orchestra.run.vm04.stdout:(16/137): libradosstriper1-19.2.3-678.ge911bdeb 4.0 MB/s | 503 kB 00:00 2026-03-08T22:38:32.261 INFO:teuthology.orchestra.run.vm04.stdout:(17/137): python3-ceph-argparse-19.2.3-678.ge91 383 kB/s | 45 kB 00:00 2026-03-08T22:38:32.380 INFO:teuthology.orchestra.run.vm04.stdout:(18/137): python3-ceph-common-19.2.3-678.ge911b 1.2 MB/s | 142 kB 00:00 2026-03-08T22:38:32.491 INFO:teuthology.orchestra.run.vm04.stdout:(19/137): librgw2-19.2.3-678.ge911bdeb.el9.x86_ 15 MB/s | 5.4 MB 00:00 2026-03-08T22:38:32.499 INFO:teuthology.orchestra.run.vm04.stdout:(20/137): python3-cephfs-19.2.3-678.ge911bdeb.e 1.4 MB/s | 165 kB 00:00 2026-03-08T22:38:32.607 INFO:teuthology.orchestra.run.vm04.stdout:(21/137): python3-rados-19.2.3-678.ge911bdeb.el 2.7 MB/s | 323 kB 00:00 2026-03-08T22:38:32.619 INFO:teuthology.orchestra.run.vm04.stdout:(22/137): python3-rbd-19.2.3-678.ge911bdeb.el9. 2.5 MB/s | 303 kB 00:00 2026-03-08T22:38:32.720 INFO:teuthology.orchestra.run.vm04.stdout:(23/137): python3-rgw-19.2.3-678.ge911bdeb.el9. 
882 kB/s | 100 kB 00:00 2026-03-08T22:38:32.736 INFO:teuthology.orchestra.run.vm04.stdout:(24/137): rbd-fuse-19.2.3-678.ge911bdeb.el9.x86 726 kB/s | 85 kB 00:00 2026-03-08T22:38:32.883 INFO:teuthology.orchestra.run.vm04.stdout:(25/137): rbd-nbd-19.2.3-678.ge911bdeb.el9.x86_ 1.1 MB/s | 171 kB 00:00 2026-03-08T22:38:32.969 INFO:teuthology.orchestra.run.vm04.stdout:(26/137): rbd-mirror-19.2.3-678.ge911bdeb.el9.x 13 MB/s | 3.1 MB 00:00 2026-03-08T22:38:32.999 INFO:teuthology.orchestra.run.vm04.stdout:(27/137): ceph-grafana-dashboards-19.2.3-678.ge 267 kB/s | 31 kB 00:00 2026-03-08T22:38:33.084 INFO:teuthology.orchestra.run.vm04.stdout:(28/137): ceph-mgr-cephadm-19.2.3-678.ge911bdeb 1.3 MB/s | 150 kB 00:00 2026-03-08T22:38:33.382 INFO:teuthology.orchestra.run.vm04.stdout:(29/137): ceph-mgr-dashboard-19.2.3-678.ge911bd 9.9 MB/s | 3.8 MB 00:00 2026-03-08T22:38:33.504 INFO:teuthology.orchestra.run.vm04.stdout:(30/137): ceph-mgr-modules-core-19.2.3-678.ge91 2.0 MB/s | 253 kB 00:00 2026-03-08T22:38:33.572 INFO:teuthology.orchestra.run.vm04.stdout:(31/137): ceph-mgr-diskprediction-local-19.2.3- 15 MB/s | 7.4 MB 00:00 2026-03-08T22:38:33.622 INFO:teuthology.orchestra.run.vm04.stdout:(32/137): ceph-mgr-rook-19.2.3-678.ge911bdeb.el 420 kB/s | 49 kB 00:00 2026-03-08T22:38:33.690 INFO:teuthology.orchestra.run.vm04.stdout:(33/137): ceph-prometheus-alerts-19.2.3-678.ge9 142 kB/s | 17 kB 00:00 2026-03-08T22:38:33.742 INFO:teuthology.orchestra.run.vm04.stdout:(34/137): ceph-volume-19.2.3-678.ge911bdeb.el9. 
2.4 MB/s | 299 kB 00:00 2026-03-08T22:38:33.818 INFO:teuthology.orchestra.run.vm04.stdout:(35/137): cephadm-19.2.3-678.ge911bdeb.el9.noar 5.9 MB/s | 769 kB 00:00 2026-03-08T22:38:33.945 INFO:teuthology.orchestra.run.vm04.stdout:(36/137): ledmon-libs-1.1.0-3.el9.x86_64.rpm 318 kB/s | 40 kB 00:00 2026-03-08T22:38:33.994 INFO:teuthology.orchestra.run.vm04.stdout:(37/137): cryptsetup-2.8.1-3.el9.x86_64.rpm 1.4 MB/s | 351 kB 00:00 2026-03-08T22:38:34.049 INFO:teuthology.orchestra.run.vm04.stdout:(38/137): libconfig-1.7.2-9.el9.x86_64.rpm 698 kB/s | 72 kB 00:00 2026-03-08T22:38:34.225 INFO:teuthology.orchestra.run.vm04.stdout:(39/137): libgfortran-11.5.0-14.el9.x86_64.rpm 3.4 MB/s | 794 kB 00:00 2026-03-08T22:38:34.274 INFO:teuthology.orchestra.run.vm04.stdout:(40/137): libquadmath-11.5.0-14.el9.x86_64.rpm 820 kB/s | 184 kB 00:00 2026-03-08T22:38:34.374 INFO:teuthology.orchestra.run.vm04.stdout:(41/137): mailcap-2.1.49-5.el9.noarch.rpm 223 kB/s | 33 kB 00:00 2026-03-08T22:38:34.386 INFO:teuthology.orchestra.run.vm04.stdout:(42/137): pciutils-3.7.0-7.el9.x86_64.rpm 834 kB/s | 93 kB 00:00 2026-03-08T22:38:34.461 INFO:teuthology.orchestra.run.vm04.stdout:(43/137): python3-cffi-1.14.5-5.el9.x86_64.rpm 2.9 MB/s | 253 kB 00:00 2026-03-08T22:38:34.507 INFO:teuthology.orchestra.run.vm04.stdout:(44/137): python3-ply-3.11-14.el9.noarch.rpm 2.3 MB/s | 106 kB 00:00 2026-03-08T22:38:34.574 INFO:teuthology.orchestra.run.vm04.stdout:(45/137): python3-pycparser-2.20-6.el9.noarch.r 2.0 MB/s | 135 kB 00:00 2026-03-08T22:38:34.638 INFO:teuthology.orchestra.run.vm04.stdout:(46/137): python3-cryptography-36.0.1-5.el9.x86 5.0 MB/s | 1.2 MB 00:00 2026-03-08T22:38:34.642 INFO:teuthology.orchestra.run.vm04.stdout:(47/137): python3-pyparsing-2.4.7-9.el9.noarch. 
2.2 MB/s | 150 kB 00:00 2026-03-08T22:38:34.699 INFO:teuthology.orchestra.run.vm04.stdout:(48/137): python3-requests-2.25.1-10.el9.noarch 2.0 MB/s | 126 kB 00:00 2026-03-08T22:38:34.711 INFO:teuthology.orchestra.run.vm04.stdout:(49/137): python3-urllib3-1.26.5-7.el9.noarch.r 3.1 MB/s | 218 kB 00:00 2026-03-08T22:38:34.867 INFO:teuthology.orchestra.run.vm04.stdout:(50/137): ceph-test-19.2.3-678.ge911bdeb.el9.x8 16 MB/s | 50 MB 00:03 2026-03-08T22:38:34.868 INFO:teuthology.orchestra.run.vm04.stdout:(51/137): unzip-6.0-59.el9.x86_64.rpm 1.0 MB/s | 182 kB 00:00 2026-03-08T22:38:34.869 INFO:teuthology.orchestra.run.vm04.stdout:(52/137): zip-3.0-35.el9.x86_64.rpm 1.6 MB/s | 266 kB 00:00 2026-03-08T22:38:35.300 INFO:teuthology.orchestra.run.vm04.stdout:(53/137): flexiblas-3.0.4-9.el9.x86_64.rpm 69 kB/s | 30 kB 00:00 2026-03-08T22:38:35.571 INFO:teuthology.orchestra.run.vm04.stdout:(54/137): flexiblas-openblas-openmp-3.0.4-9.el9 55 kB/s | 15 kB 00:00 2026-03-08T22:38:35.624 INFO:teuthology.orchestra.run.vm04.stdout:(55/137): boost-program-options-1.75.0-13.el9.x 137 kB/s | 104 kB 00:00 2026-03-08T22:38:36.204 INFO:teuthology.orchestra.run.vm04.stdout:(56/137): libpmemobj-1.12.1-1.el9.x86_64.rpm 277 kB/s | 160 kB 00:00 2026-03-08T22:38:36.215 INFO:teuthology.orchestra.run.vm04.stdout:(57/137): flexiblas-netlib-3.0.4-9.el9.x86_64.r 2.2 MB/s | 3.0 MB 00:01 2026-03-08T22:38:36.589 INFO:teuthology.orchestra.run.vm04.stdout:(58/137): librabbitmq-0.11.0-7.el9.x86_64.rpm 117 kB/s | 45 kB 00:00 2026-03-08T22:38:36.704 INFO:teuthology.orchestra.run.vm04.stdout:(59/137): libnbd-1.20.3-4.el9.x86_64.rpm 145 kB/s | 164 kB 00:01 2026-03-08T22:38:36.824 INFO:teuthology.orchestra.run.vm04.stdout:(60/137): libstoragemgmt-1.10.1-1.el9.x86_64.rp 1.0 MB/s | 246 kB 00:00 2026-03-08T22:38:37.691 INFO:teuthology.orchestra.run.vm04.stdout:(61/137): libxslt-1.1.34-12.el9.x86_64.rpm 236 kB/s | 233 kB 00:00 2026-03-08T22:38:38.426 INFO:teuthology.orchestra.run.vm04.stdout:(62/137): 
lttng-ust-2.12.0-6.el9.x86_64.rpm 183 kB/s | 292 kB 00:01 2026-03-08T22:38:38.517 INFO:teuthology.orchestra.run.vm04.stdout:(63/137): librdkafka-1.6.1-102.el9.x86_64.rpm 288 kB/s | 662 kB 00:02 2026-03-08T22:38:38.626 INFO:teuthology.orchestra.run.vm04.stdout:(64/137): lua-5.4.4-4.el9.x86_64.rpm 202 kB/s | 188 kB 00:00 2026-03-08T22:38:39.806 INFO:teuthology.orchestra.run.vm04.stdout:(65/137): openblas-0.3.29-1.el9.x86_64.rpm 30 kB/s | 42 kB 00:01 2026-03-08T22:38:43.004 INFO:teuthology.orchestra.run.vm04.stdout:(66/137): protobuf-3.14.0-17.el9.x86_64.rpm 235 kB/s | 1.0 MB 00:04 2026-03-08T22:38:44.096 INFO:teuthology.orchestra.run.vm04.stdout:(67/137): python3-babel-2.9.1-2.el9.noarch.rpm 1.4 MB/s | 6.0 MB 00:04 2026-03-08T22:38:44.661 INFO:teuthology.orchestra.run.vm04.stdout:(68/137): python3-devel-3.9.25-3.el9.x86_64.rpm 148 kB/s | 244 kB 00:01 2026-03-08T22:38:44.947 INFO:teuthology.orchestra.run.vm04.stdout:(69/137): python3-jinja2-2.11.3-8.el9.noarch.rp 292 kB/s | 249 kB 00:00 2026-03-08T22:38:45.666 INFO:teuthology.orchestra.run.vm04.stdout:(70/137): python3-jmespath-1.0.1-1.el9.noarch.r 47 kB/s | 48 kB 00:01 2026-03-08T22:38:45.772 INFO:teuthology.orchestra.run.vm04.stdout:(71/137): python3-libstoragemgmt-1.10.1-1.el9.x 214 kB/s | 177 kB 00:00 2026-03-08T22:38:46.177 INFO:teuthology.orchestra.run.vm04.stdout:(72/137): python3-markupsafe-1.1.1-12.el9.x86_6 86 kB/s | 35 kB 00:00 2026-03-08T22:38:47.716 INFO:teuthology.orchestra.run.vm04.stdout:(73/137): python3-mako-1.1.4-6.el9.noarch.rpm 84 kB/s | 172 kB 00:02 2026-03-08T22:38:48.374 INFO:teuthology.orchestra.run.vm04.stdout:(74/137): openblas-openmp-0.3.29-1.el9.x86_64.r 549 kB/s | 5.3 MB 00:09 2026-03-08T22:38:48.605 INFO:teuthology.orchestra.run.vm04.stdout:(75/137): python3-packaging-20.9-5.el9.noarch.r 335 kB/s | 77 kB 00:00 2026-03-08T22:38:49.004 INFO:teuthology.orchestra.run.vm04.stdout:(76/137): python3-numpy-f2py-1.23.5-2.el9.x86_6 343 kB/s | 442 kB 00:01 2026-03-08T22:38:49.405 
INFO:teuthology.orchestra.run.vm04.stdout:(77/137): python3-numpy-1.23.5-2.el9.x86_64.rpm 1.9 MB/s | 6.1 MB 00:03 2026-03-08T22:38:49.867 INFO:teuthology.orchestra.run.vm04.stdout:(78/137): python3-pyasn1-0.4.8-7.el9.noarch.rpm 182 kB/s | 157 kB 00:00 2026-03-08T22:38:50.166 INFO:teuthology.orchestra.run.vm04.stdout:(79/137): python3-protobuf-3.14.0-17.el9.noarch 171 kB/s | 267 kB 00:01 2026-03-08T22:38:50.258 INFO:teuthology.orchestra.run.vm04.stdout:(80/137): python3-requests-oauthlib-1.3.0-12.el 137 kB/s | 54 kB 00:00 2026-03-08T22:38:50.299 INFO:teuthology.orchestra.run.vm04.stdout:(81/137): python3-pyasn1-modules-0.4.8-7.el9.no 310 kB/s | 277 kB 00:00 2026-03-08T22:38:50.404 INFO:teuthology.orchestra.run.vm04.stdout:(82/137): python3-toml-0.10.2-6.el9.noarch.rpm 286 kB/s | 42 kB 00:00 2026-03-08T22:38:50.781 INFO:teuthology.orchestra.run.vm04.stdout:(83/137): qatlib-service-25.08.0-2.el9.x86_64.r 98 kB/s | 37 kB 00:00 2026-03-08T22:38:50.849 INFO:teuthology.orchestra.run.vm04.stdout:(84/137): qatlib-25.08.0-2.el9.x86_64.rpm 436 kB/s | 240 kB 00:00 2026-03-08T22:38:50.863 INFO:teuthology.orchestra.run.vm04.stdout:(85/137): qatzip-libs-1.3.1-1.el9.x86_64.rpm 818 kB/s | 66 kB 00:00 2026-03-08T22:38:51.105 INFO:teuthology.orchestra.run.vm04.stdout:(86/137): xmlstarlet-1.6.1-20.el9.x86_64.rpm 263 kB/s | 64 kB 00:00 2026-03-08T22:38:51.249 INFO:teuthology.orchestra.run.vm04.stdout:(87/137): lua-devel-5.4.4-4.el9.x86_64.rpm 156 kB/s | 22 kB 00:00 2026-03-08T22:38:51.489 INFO:teuthology.orchestra.run.vm04.stdout:(88/137): protobuf-compiler-3.14.0-17.el9.x86_6 3.5 MB/s | 862 kB 00:00 2026-03-08T22:38:51.509 INFO:teuthology.orchestra.run.vm04.stdout:(89/137): abseil-cpp-20211102.0-4.el9.x86_64.rp 27 MB/s | 551 kB 00:00 2026-03-08T22:38:51.517 INFO:teuthology.orchestra.run.vm04.stdout:(90/137): gperftools-libs-2.9.1-3.el9.x86_64.rp 38 MB/s | 308 kB 00:00 2026-03-08T22:38:51.519 INFO:teuthology.orchestra.run.vm04.stdout:(91/137): grpc-data-1.46.7-10.el9.noarch.rpm 9.9 
MB/s | 19 kB 00:00 2026-03-08T22:38:51.581 INFO:teuthology.orchestra.run.vm04.stdout:(92/137): libarrow-9.0.0-15.el9.x86_64.rpm 72 MB/s | 4.4 MB 00:00 2026-03-08T22:38:51.584 INFO:teuthology.orchestra.run.vm04.stdout:(93/137): libarrow-doc-9.0.0-15.el9.noarch.rpm 9.0 MB/s | 25 kB 00:00 2026-03-08T22:38:51.587 INFO:teuthology.orchestra.run.vm04.stdout:(94/137): liboath-2.6.12-1.el9.x86_64.rpm 19 MB/s | 49 kB 00:00 2026-03-08T22:38:51.590 INFO:teuthology.orchestra.run.vm04.stdout:(95/137): libunwind-1.6.2-1.el9.x86_64.rpm 24 MB/s | 67 kB 00:00 2026-03-08T22:38:51.594 INFO:teuthology.orchestra.run.vm04.stdout:(96/137): luarocks-3.9.2-5.el9.noarch.rpm 37 MB/s | 151 kB 00:00 2026-03-08T22:38:51.605 INFO:teuthology.orchestra.run.vm04.stdout:(97/137): parquet-libs-9.0.0-15.el9.x86_64.rpm 76 MB/s | 838 kB 00:00 2026-03-08T22:38:51.614 INFO:teuthology.orchestra.run.vm04.stdout:(98/137): python3-asyncssh-2.13.2-5.el9.noarch. 63 MB/s | 548 kB 00:00 2026-03-08T22:38:51.617 INFO:teuthology.orchestra.run.vm04.stdout:(99/137): socat-1.7.4.1-8.el9.x86_64.rpm 395 kB/s | 303 kB 00:00 2026-03-08T22:38:51.617 INFO:teuthology.orchestra.run.vm04.stdout:(100/137): python3-autocommand-2.2.2-8.el9.noar 8.4 MB/s | 29 kB 00:00 2026-03-08T22:38:51.619 INFO:teuthology.orchestra.run.vm04.stdout:(101/137): python3-backports-tarfile-1.2.0-1.el 25 MB/s | 60 kB 00:00 2026-03-08T22:38:51.622 INFO:teuthology.orchestra.run.vm04.stdout:(102/137): python3-cachetools-4.2.4-1.el9.noarc 13 MB/s | 32 kB 00:00 2026-03-08T22:38:51.623 INFO:teuthology.orchestra.run.vm04.stdout:(103/137): python3-bcrypt-3.2.2-1.el9.x86_64.rp 8.3 MB/s | 43 kB 00:00 2026-03-08T22:38:51.624 INFO:teuthology.orchestra.run.vm04.stdout:(104/137): python3-certifi-2023.05.07-4.el9.noa 6.4 MB/s | 14 kB 00:00 2026-03-08T22:38:51.631 INFO:teuthology.orchestra.run.vm04.stdout:(105/137): python3-cheroot-10.0.1-4.el9.noarch. 
20 MB/s | 173 kB 00:00 2026-03-08T22:38:51.633 INFO:teuthology.orchestra.run.vm04.stdout:(106/137): python3-cherrypy-18.6.1-2.el9.noarch 43 MB/s | 358 kB 00:00 2026-03-08T22:38:51.640 INFO:teuthology.orchestra.run.vm04.stdout:(107/137): python3-google-auth-2.45.0-1.el9.noa 37 MB/s | 254 kB 00:00 2026-03-08T22:38:51.645 INFO:teuthology.orchestra.run.vm04.stdout:(108/137): python3-grpcio-tools-1.46.7-10.el9.x 28 MB/s | 144 kB 00:00 2026-03-08T22:38:51.648 INFO:teuthology.orchestra.run.vm04.stdout:(109/137): python3-jaraco-8.2.1-3.el9.noarch.rp 3.9 MB/s | 11 kB 00:00 2026-03-08T22:38:51.651 INFO:teuthology.orchestra.run.vm04.stdout:(110/137): python3-jaraco-classes-3.2.1-5.el9.n 5.6 MB/s | 18 kB 00:00 2026-03-08T22:38:51.655 INFO:teuthology.orchestra.run.vm04.stdout:(111/137): python3-jaraco-collections-3.0.0-8.e 6.6 MB/s | 23 kB 00:00 2026-03-08T22:38:51.661 INFO:teuthology.orchestra.run.vm04.stdout:(112/137): python3-grpcio-1.46.7-10.el9.x86_64. 73 MB/s | 2.0 MB 00:00 2026-03-08T22:38:51.662 INFO:teuthology.orchestra.run.vm04.stdout:(113/137): python3-jaraco-context-6.0.1-3.el9.n 2.7 MB/s | 20 kB 00:00 2026-03-08T22:38:51.663 INFO:teuthology.orchestra.run.vm04.stdout:(114/137): python3-jaraco-functools-3.5.0-2.el9 10 MB/s | 19 kB 00:00 2026-03-08T22:38:51.664 INFO:teuthology.orchestra.run.vm04.stdout:(115/137): python3-jaraco-text-4.0.0-2.el9.noar 13 MB/s | 26 kB 00:00 2026-03-08T22:38:51.670 INFO:teuthology.orchestra.run.vm04.stdout:(116/137): python3-logutils-0.3.5-21.el9.noarch 8.4 MB/s | 46 kB 00:00 2026-03-08T22:38:51.674 INFO:teuthology.orchestra.run.vm04.stdout:(117/137): python3-more-itertools-8.12.0-2.el9. 
17 MB/s | 79 kB 00:00 2026-03-08T22:38:51.679 INFO:teuthology.orchestra.run.vm04.stdout:(118/137): python3-kubernetes-26.1.0-3.el9.noar 67 MB/s | 1.0 MB 00:00 2026-03-08T22:38:51.680 INFO:teuthology.orchestra.run.vm04.stdout:(119/137): python3-natsort-7.1.1-5.el9.noarch.r 11 MB/s | 58 kB 00:00 2026-03-08T22:38:51.684 INFO:teuthology.orchestra.run.vm04.stdout:(120/137): python3-pecan-1.4.2-3.el9.noarch.rpm 53 MB/s | 272 kB 00:00 2026-03-08T22:38:51.685 INFO:teuthology.orchestra.run.vm04.stdout:(121/137): python3-portend-3.1.0-2.el9.noarch.r 3.3 MB/s | 16 kB 00:00 2026-03-08T22:38:51.687 INFO:teuthology.orchestra.run.vm04.stdout:(122/137): python3-pyOpenSSL-21.0.0-1.el9.noarc 32 MB/s | 90 kB 00:00 2026-03-08T22:38:51.688 INFO:teuthology.orchestra.run.vm04.stdout:(123/137): python3-repoze-lru-0.7-16.el9.noarch 11 MB/s | 31 kB 00:00 2026-03-08T22:38:51.692 INFO:teuthology.orchestra.run.vm04.stdout:(124/137): python3-routes-2.5.1-5.el9.noarch.rp 41 MB/s | 188 kB 00:00 2026-03-08T22:38:51.693 INFO:teuthology.orchestra.run.vm04.stdout:(125/137): python3-rsa-4.9-2.el9.noarch.rpm 12 MB/s | 59 kB 00:00 2026-03-08T22:38:51.694 INFO:teuthology.orchestra.run.vm04.stdout:(126/137): python3-tempora-5.0.0-2.el9.noarch.r 17 MB/s | 36 kB 00:00 2026-03-08T22:38:51.695 INFO:teuthology.orchestra.run.vm04.stdout:(127/137): python3-typing-extensions-4.15.0-1.e 32 MB/s | 86 kB 00:00 2026-03-08T22:38:51.699 INFO:teuthology.orchestra.run.vm04.stdout:(128/137): python3-webob-1.8.8-2.el9.noarch.rpm 51 MB/s | 230 kB 00:00 2026-03-08T22:38:51.700 INFO:teuthology.orchestra.run.vm04.stdout:(129/137): python3-websocket-client-1.2.3-2.el9 17 MB/s | 90 kB 00:00 2026-03-08T22:38:51.704 INFO:teuthology.orchestra.run.vm04.stdout:(130/137): python3-xmltodict-0.12.0-15.el9.noar 5.9 MB/s | 22 kB 00:00 2026-03-08T22:38:51.706 INFO:teuthology.orchestra.run.vm04.stdout:(131/137): python3-werkzeug-2.0.3-3.el9.1.noarc 58 MB/s | 427 kB 00:00 2026-03-08T22:38:51.707 
INFO:teuthology.orchestra.run.vm04.stdout:(132/137): python3-zc-lockfile-2.0-10.el9.noarc 9.0 MB/s | 20 kB 00:00 2026-03-08T22:38:51.725 INFO:teuthology.orchestra.run.vm04.stdout:(133/137): re2-20211101-20.el9.x86_64.rpm 10 MB/s | 191 kB 00:00 2026-03-08T22:38:51.729 INFO:teuthology.orchestra.run.vm04.stdout:(134/137): thrift-0.15.0-4.el9.x86_64.rpm 71 MB/s | 1.6 MB 00:00 2026-03-08T22:38:52.669 INFO:teuthology.orchestra.run.vm04.stdout:(135/137): librados2-19.2.3-678.ge911bdeb.el9.x 3.6 MB/s | 3.4 MB 00:00 2026-03-08T22:38:52.680 INFO:teuthology.orchestra.run.vm04.stdout:(136/137): librbd1-19.2.3-678.ge911bdeb.el9.x86 3.3 MB/s | 3.2 MB 00:00 2026-03-08T22:38:55.925 INFO:teuthology.orchestra.run.vm04.stdout:(137/137): python3-scipy-1.9.3-2.el9.x86_64.rpm 3.3 MB/s | 19 MB 00:05 2026-03-08T22:38:55.927 INFO:teuthology.orchestra.run.vm04.stdout:-------------------------------------------------------------------------------- 2026-03-08T22:38:55.927 INFO:teuthology.orchestra.run.vm04.stdout:Total 7.4 MB/s | 210 MB 00:28 2026-03-08T22:38:56.556 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-08T22:38:56.606 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-08T22:38:56.606 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-08T22:38:57.459 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 
2026-03-08T22:38:57.459 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-08T22:38:58.383 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-08T22:38:58.398 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 1/139 2026-03-08T22:38:58.410 INFO:teuthology.orchestra.run.vm04.stdout: Installing : thrift-0.15.0-4.el9.x86_64 2/139 2026-03-08T22:38:58.590 INFO:teuthology.orchestra.run.vm04.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/139 2026-03-08T22:38:58.593 INFO:teuthology.orchestra.run.vm04.stdout: Upgrading : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/139 2026-03-08T22:38:58.664 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/139 2026-03-08T22:38:58.667 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 5/139 2026-03-08T22:38:58.703 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 5/139 2026-03-08T22:38:58.713 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 6/139 2026-03-08T22:38:58.783 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/139 2026-03-08T22:38:58.876 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/139 2026-03-08T22:38:58.903 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/139 2026-03-08T22:38:59.124 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libnbd-1.20.3-4.el9.x86_64 10/139 2026-03-08T22:38:59.126 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 11/139 2026-03-08T22:38:59.188 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 11/139 2026-03-08T22:38:59.199 
INFO:teuthology.orchestra.run.vm04.stdout: Installing : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 12/139
2026-03-08T22:38:59.229 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 12/139
2026-03-08T22:38:59.264 INFO:teuthology.orchestra.run.vm04.stdout: Installing : re2-1:20211101-20.el9.x86_64 13/139
2026-03-08T22:38:59.304 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 14/139
2026-03-08T22:38:59.421 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 15/139
2026-03-08T22:38:59.458 INFO:teuthology.orchestra.run.vm04.stdout: Installing : liboath-2.6.12-1.el9.x86_64 16/139
2026-03-08T22:38:59.518 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 17/139
2026-03-08T22:38:59.532 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 18/139
2026-03-08T22:38:59.540 INFO:teuthology.orchestra.run.vm04.stdout: Installing : protobuf-3.14.0-17.el9.x86_64 19/139
2026-03-08T22:38:59.546 INFO:teuthology.orchestra.run.vm04.stdout: Installing : lua-5.4.4-4.el9.x86_64 20/139
2026-03-08T22:38:59.553 INFO:teuthology.orchestra.run.vm04.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 21/139
2026-03-08T22:38:59.683 INFO:teuthology.orchestra.run.vm04.stdout: Installing : unzip-6.0-59.el9.x86_64 22/139
2026-03-08T22:38:59.700 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 23/139
2026-03-08T22:38:59.705 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 24/139
2026-03-08T22:38:59.714 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 25/139
2026-03-08T22:38:59.717 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 26/139
2026-03-08T22:38:59.748 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 27/139
2026-03-08T22:38:59.755 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 28/139
2026-03-08T22:38:59.766 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9 29/139
2026-03-08T22:38:59.781 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 30/139
2026-03-08T22:38:59.790 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 31/139
2026-03-08T22:38:59.821 INFO:teuthology.orchestra.run.vm04.stdout: Installing : zip-3.0-35.el9.x86_64 32/139
2026-03-08T22:38:59.828 INFO:teuthology.orchestra.run.vm04.stdout: Installing : luarocks-3.9.2-5.el9.noarch 33/139
2026-03-08T22:38:59.837 INFO:teuthology.orchestra.run.vm04.stdout: Installing : lua-devel-5.4.4-4.el9.x86_64 34/139
2026-03-08T22:38:59.868 INFO:teuthology.orchestra.run.vm04.stdout: Installing : protobuf-compiler-3.14.0-17.el9.x86_64 35/139
2026-03-08T22:38:59.931 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 36/139
2026-03-08T22:38:59.950 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 37/139
2026-03-08T22:38:59.959 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rsa-4.9-2.el9.noarch 38/139
2026-03-08T22:38:59.970 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 39/139
2026-03-08T22:38:59.977 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 40/139
2026-03-08T22:38:59.982 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 41/139
2026-03-08T22:39:00.003 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 42/139
2026-03-08T22:39:00.031 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 43/139
2026-03-08T22:39:00.038 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 44/139
2026-03-08T22:39:00.044 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 45/139
2026-03-08T22:39:00.063 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 46/139
2026-03-08T22:39:00.078 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 47/139
2026-03-08T22:39:00.091 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 48/139
2026-03-08T22:39:00.156 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 49/139
2026-03-08T22:39:00.168 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 50/139
2026-03-08T22:39:00.179 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 51/139
2026-03-08T22:39:00.232 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 52/139
2026-03-08T22:39:00.629 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 53/139
2026-03-08T22:39:00.647 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 54/139
2026-03-08T22:39:00.687 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 55/139
2026-03-08T22:39:00.697 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 56/139
2026-03-08T22:39:00.702 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 57/139
2026-03-08T22:39:00.711 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 58/139
2026-03-08T22:39:00.716 INFO:teuthology.orchestra.run.vm04.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 59/139
2026-03-08T22:39:00.718 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 60/139
2026-03-08T22:39:00.750 INFO:teuthology.orchestra.run.vm04.stdout: Installing : grpc-data-1.46.7-10.el9.noarch 61/139
2026-03-08T22:39:00.810 INFO:teuthology.orchestra.run.vm04.stdout: Installing : abseil-cpp-20211102.0-4.el9.x86_64 62/139
2026-03-08T22:39:00.829 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-grpcio-1.46.7-10.el9.x86_64 63/139
2026-03-08T22:39:00.837 INFO:teuthology.orchestra.run.vm04.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 64/139
2026-03-08T22:39:00.843 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 65/139
2026-03-08T22:39:00.851 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 66/139
2026-03-08T22:39:00.857 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 67/139
2026-03-08T22:39:00.867 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 68/139
2026-03-08T22:39:00.874 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 69/139
2026-03-08T22:39:00.910 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 70/139
2026-03-08T22:39:00.925 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-protobuf-3.14.0-17.el9.noarch 71/139
2026-03-08T22:39:00.969 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-grpcio-tools-1.46.7-10.el9.x86_64 72/139
2026-03-08T22:39:01.255 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 73/139
2026-03-08T22:39:01.289 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 74/139
2026-03-08T22:39:01.296 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 75/139
2026-03-08T22:39:01.360 INFO:teuthology.orchestra.run.vm04.stdout: Installing : openblas-0.3.29-1.el9.x86_64 76/139
2026-03-08T22:39:01.363 INFO:teuthology.orchestra.run.vm04.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 77/139
2026-03-08T22:39:01.388 INFO:teuthology.orchestra.run.vm04.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 78/139
2026-03-08T22:39:01.795 INFO:teuthology.orchestra.run.vm04.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 79/139
2026-03-08T22:39:01.895 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 80/139
2026-03-08T22:39:02.751 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 81/139
2026-03-08T22:39:02.784 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 82/139
2026-03-08T22:39:02.793 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 83/139
2026-03-08T22:39:02.799 INFO:teuthology.orchestra.run.vm04.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 84/139
2026-03-08T22:39:02.967 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 85/139
2026-03-08T22:39:02.970 INFO:teuthology.orchestra.run.vm04.stdout: Upgrading : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 86/139
2026-03-08T22:39:03.007 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 86/139
2026-03-08T22:39:03.012 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 87/139
2026-03-08T22:39:03.021 INFO:teuthology.orchestra.run.vm04.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 88/139
2026-03-08T22:39:03.309 INFO:teuthology.orchestra.run.vm04.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 89/139
2026-03-08T22:39:03.311 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 90/139
2026-03-08T22:39:03.333 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 90/139
2026-03-08T22:39:03.335 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 91/139
2026-03-08T22:39:04.620 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 92/139
2026-03-08T22:39:04.626 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 92/139
2026-03-08T22:39:04.649 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 92/139
2026-03-08T22:39:04.662 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyparsing-2.4.7-9.el9.noarch 93/139
2026-03-08T22:39:04.672 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-packaging-20.9-5.el9.noarch 94/139
2026-03-08T22:39:04.691 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ply-3.11-14.el9.noarch 95/139
2026-03-08T22:39:04.711 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 96/139
2026-03-08T22:39:04.807 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 97/139
2026-03-08T22:39:04.822 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 98/139
2026-03-08T22:39:04.855 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 99/139
2026-03-08T22:39:04.896 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 100/139
2026-03-08T22:39:04.970 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 101/139
2026-03-08T22:39:04.981 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 102/139
2026-03-08T22:39:04.987 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 103/139
2026-03-08T22:39:04.994 INFO:teuthology.orchestra.run.vm04.stdout: Installing : pciutils-3.7.0-7.el9.x86_64 104/139
2026-03-08T22:39:04.997 INFO:teuthology.orchestra.run.vm04.stdout: Installing : qatlib-25.08.0-2.el9.x86_64 105/139
2026-03-08T22:39:05.000 INFO:teuthology.orchestra.run.vm04.stdout: Installing : qatlib-service-25.08.0-2.el9.x86_64 106/139
2026-03-08T22:39:05.040 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 106/139
2026-03-08T22:39:05.394 INFO:teuthology.orchestra.run.vm04.stdout: Installing : qatzip-libs-1.3.1-1.el9.x86_64 107/139
2026-03-08T22:39:05.413 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 108/139
2026-03-08T22:39:05.470 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 108/139
2026-03-08T22:39:05.470 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target.
2026-03-08T22:39:05.470 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service.
2026-03-08T22:39:05.470 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:05.476 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 109/139
2026-03-08T22:39:13.064 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 109/139
2026-03-08T22:39:13.064 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /sys
2026-03-08T22:39:13.064 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /proc
2026-03-08T22:39:13.064 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /mnt
2026-03-08T22:39:13.064 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /var/tmp
2026-03-08T22:39:13.064 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /home
2026-03-08T22:39:13.064 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /root
2026-03-08T22:39:13.064 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /tmp
2026-03-08T22:39:13.064 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:13.209 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 110/139
2026-03-08T22:39:13.237 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 110/139
2026-03-08T22:39:13.237 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:39:13.237 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-08T22:39:13.237 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-08T22:39:13.237 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-08T22:39:13.237 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:13.507 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 111/139
2026-03-08T22:39:13.543 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 111/139
2026-03-08T22:39:13.543 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:39:13.543 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-08T22:39:13.543 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-08T22:39:13.543 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-08T22:39:13.543 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:13.553 INFO:teuthology.orchestra.run.vm04.stdout: Installing : mailcap-2.1.49-5.el9.noarch 112/139
2026-03-08T22:39:13.556 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 113/139
2026-03-08T22:39:13.574 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 114/139
2026-03-08T22:39:13.574 INFO:teuthology.orchestra.run.vm04.stdout:Creating group 'qat' with GID 994.
2026-03-08T22:39:13.574 INFO:teuthology.orchestra.run.vm04.stdout:Creating group 'libstoragemgmt' with GID 993.
2026-03-08T22:39:13.575 INFO:teuthology.orchestra.run.vm04.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 993 and GID 993.
2026-03-08T22:39:13.575 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:13.586 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 114/139
2026-03-08T22:39:13.617 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 114/139
2026-03-08T22:39:13.617 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service.
2026-03-08T22:39:13.617 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:13.665 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 115/139
2026-03-08T22:39:13.758 INFO:teuthology.orchestra.run.vm04.stdout: Installing : cryptsetup-2.8.1-3.el9.x86_64 116/139
2026-03-08T22:39:13.763 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 117/139
2026-03-08T22:39:13.785 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 117/139
2026-03-08T22:39:13.785 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:39:13.785 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-volume@*.service" escaped as "ceph-volume@\x2a.service".
2026-03-08T22:39:13.785 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:14.694 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 118/139
2026-03-08T22:39:14.721 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 118/139
2026-03-08T22:39:14.721 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:39:14.721 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-08T22:39:14.721 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-08T22:39:14.721 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-08T22:39:14.721 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:14.798 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 119/139
2026-03-08T22:39:14.803 INFO:teuthology.orchestra.run.vm04.stdout: Installing : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 119/139
2026-03-08T22:39:14.811 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 120/139
2026-03-08T22:39:14.839 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 121/139
2026-03-08T22:39:14.842 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 122/139
2026-03-08T22:39:15.461 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 122/139
2026-03-08T22:39:15.773 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 123/139
2026-03-08T22:39:16.334 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 123/139
2026-03-08T22:39:16.337 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 124/139
2026-03-08T22:39:16.411 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 124/139
2026-03-08T22:39:16.476 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 125/139
2026-03-08T22:39:16.478 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 126/139
2026-03-08T22:39:16.502 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 126/139
2026-03-08T22:39:16.502 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:39:16.502 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-08T22:39:16.502 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-08T22:39:16.502 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-08T22:39:16.502 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:16.516 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 127/139
2026-03-08T22:39:16.530 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 127/139
2026-03-08T22:39:17.086 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 128/139
2026-03-08T22:39:17.115 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 129/139
2026-03-08T22:39:17.140 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 129/139
2026-03-08T22:39:17.140 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:39:17.140 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-08T22:39:17.140 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-08T22:39:17.140 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-08T22:39:17.140 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:17.158 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 130/139
2026-03-08T22:39:17.185 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:19.2.3-678.ge911bd 130/139
2026-03-08T22:39:17.186 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:39:17.186 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-08T22:39:17.186 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:17.351 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 131/139
2026-03-08T22:39:17.376 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 131/139
2026-03-08T22:39:17.376 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:39:17.376 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-08T22:39:17.376 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-08T22:39:17.376 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-08T22:39:17.376 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T22:39:20.141 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 132/139
2026-03-08T22:39:20.152 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 133/139
2026-03-08T22:39:20.158 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 134/139
2026-03-08T22:39:20.218 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_6 135/139
2026-03-08T22:39:20.228 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 136/139
2026-03-08T22:39:20.233 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 137/139
2026-03-08T22:39:20.233 INFO:teuthology.orchestra.run.vm04.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 138/139
2026-03-08T22:39:20.252 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 138/139
2026-03-08T22:39:20.252 INFO:teuthology.orchestra.run.vm04.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 139/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 139/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 1/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 2/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 3/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 4/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 5/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 6/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 7/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 9/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 10/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 11/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 12/139
2026-03-08T22:39:21.840 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_6 13/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 14/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 15/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 16/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 17/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 18/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9 19/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 20/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 21/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 22/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 23/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 24/139
2026-03-08T22:39:21.841 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 25/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 26/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 27/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 28/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 29/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 30/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 31/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 32/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 33/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 34/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 35/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 36/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cryptsetup-2.8.1-3.el9.x86_64 37/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 38/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 39/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 40/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 41/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 42/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : pciutils-3.7.0-7.el9.x86_64 43/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 44/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 45/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ply-3.11-14.el9.noarch 46/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 47/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyparsing-2.4.7-9.el9.noarch 48/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 49/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 50/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : unzip-6.0-59.el9.x86_64 51/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : zip-3.0-35.el9.x86_64 52/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 53/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 54/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 55/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 56/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libnbd-1.20.3-4.el9.x86_64 57/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 58/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 59/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 60/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 61/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 62/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 63/139
2026-03-08T22:39:21.842 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lua-5.4.4-4.el9.x86_64 64/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 65/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 66/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : protobuf-3.14.0-17.el9.x86_64 67/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 68/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 69/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 70/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 71/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 72/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 73/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 74/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 75/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 76/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-packaging-20.9-5.el9.noarch 77/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-protobuf-3.14.0-17.el9.noarch 78/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 79/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 80/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 81/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 82/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 83/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatlib-25.08.0-2.el9.x86_64 84/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatlib-service-25.08.0-2.el9.x86_64 85/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatzip-libs-1.3.1-1.el9.x86_64 86/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 87/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 88/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lua-devel-5.4.4-4.el9.x86_64 89/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : protobuf-compiler-3.14.0-17.el9.x86_64 90/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : abseil-cpp-20211102.0-4.el9.x86_64 91/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 92/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : grpc-data-1.46.7-10.el9.noarch 93/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 94/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 95/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 96/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 97/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : luarocks-3.9.2-5.el9.noarch 98/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 99/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 100/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 101/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 102/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 103/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 104/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 105/139
2026-03-08T22:39:21.844 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 106/139
2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 107/139
2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 108/139
2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-grpcio-1.46.7-10.el9.x86_64 109/139
2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-grpcio-tools-1.46.7-10.el9.x86_64 110/139
2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 111/139
2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 112/139
2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 113/139
2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 114/139
2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 115/139 2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 116/139 2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 117/139 2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 118/139 2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 119/139 2026-03-08T22:39:21.845 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 120/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 121/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 122/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 123/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 124/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 125/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 126/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 127/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 128/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 129/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 130/139 
2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 131/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 132/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 133/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : re2-1:20211101-20.el9.x86_64 134/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 135/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 136/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 137/139 2026-03-08T22:39:21.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 138/139 2026-03-08T22:39:21.947 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 139/139 2026-03-08T22:39:21.947 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T22:39:21.947 INFO:teuthology.orchestra.run.vm04.stdout:Upgraded: 2026-03-08T22:39:21.947 INFO:teuthology.orchestra.run.vm04.stdout: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.947 INFO:teuthology.orchestra.run.vm04.stdout: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.947 INFO:teuthology.orchestra.run.vm04.stdout:Installed: 2026-03-08T22:39:21.947 INFO:teuthology.orchestra.run.vm04.stdout: abseil-cpp-20211102.0-4.el9.x86_64 2026-03-08T22:39:21.947 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 
INFO:teuthology.orchestra.run.vm04.stdout: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 
INFO:teuthology.orchestra.run.vm04.stdout: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: cryptsetup-2.8.1-3.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: grpc-data-1.46.7-10.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libconfig-1.7.2-9.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libnbd-1.20.3-4.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: 
libquadmath-11.5.0-14.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: lua-5.4.4-4.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: lua-devel-5.4.4-4.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: luarocks-3.9.2-5.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: mailcap-2.1.49-5.el9.noarch 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: pciutils-3.7.0-7.el9.x86_64 2026-03-08T22:39:21.948 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-3.14.0-17.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-compiler-3.14.0-17.el9.x86_64 2026-03-08T22:39:21.949 
INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-1.46.7-10.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-tools-1.46.7-10.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: 
python3-jaraco-8.2.1-3.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-jmespath-1.0.1-1.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-packaging-20.9-5.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-08T22:39:21.949 
INFO:teuthology.orchestra.run.vm04.stdout: python3-ply-3.11-14.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-protobuf-3.14.0-17.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyparsing-2.4.7-9.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-08T22:39:21.949 
INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-08T22:39:21.949 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: qatlib-25.08.0-2.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: qatlib-service-25.08.0-2.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: qatzip-libs-1.3.1-1.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: re2-1:20211101-20.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: unzip-6.0-59.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: zip-3.0-35.el9.x86_64 2026-03-08T22:39:21.950 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T22:39:21.950 
INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T22:39:22.063 DEBUG:teuthology.parallel:result is None 2026-03-08T22:39:22.063 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-08T22:39:22.659 DEBUG:teuthology.orchestra.run.vm04:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}' 2026-03-08T22:39:22.683 INFO:teuthology.orchestra.run.vm04.stdout:19.2.3-678.ge911bdeb.el9 2026-03-08T22:39:22.683 INFO:teuthology.packaging:The installed version of ceph is 19.2.3-678.ge911bdeb.el9 2026-03-08T22:39:22.683 INFO:teuthology.task.install:The correct ceph version 19.2.3-678.ge911bdeb is installed. 2026-03-08T22:39:22.684 INFO:teuthology.task.install.util:Shipping valgrind.supp... 2026-03-08T22:39:22.684 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-08T22:39:22.684 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp 2026-03-08T22:39:22.755 INFO:teuthology.task.install.util:Shipping 'daemon-helper'... 2026-03-08T22:39:22.755 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-08T22:39:22.755 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/daemon-helper 2026-03-08T22:39:22.820 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/daemon-helper 2026-03-08T22:39:22.883 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'... 2026-03-08T22:39:22.883 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-08T22:39:22.883 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/adjust-ulimits 2026-03-08T22:39:22.947 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/adjust-ulimits 2026-03-08T22:39:23.014 INFO:teuthology.task.install.util:Shipping 'stdin-killer'... 
2026-03-08T22:39:23.014 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-08T22:39:23.014 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/stdin-killer 2026-03-08T22:39:23.080 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/stdin-killer 2026-03-08T22:39:23.146 INFO:teuthology.run_tasks:Running task workunit... 2026-03-08T22:39:23.150 INFO:tasks.workunit:Pulling workunits from ref 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:39:23.150 INFO:tasks.workunit:Making a separate scratch dir for every client... 2026-03-08T22:39:23.150 INFO:tasks.workunit:timeout=3h 2026-03-08T22:39:23.150 INFO:tasks.workunit:cleanup=True 2026-03-08T22:39:23.150 DEBUG:teuthology.orchestra.run.vm04:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:39:23.203 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-08T22:39:23.203 INFO:teuthology.orchestra.run.vm04.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory 2026-03-08T22:39:23.203 DEBUG:teuthology.orchestra.run.vm04:> mkdir -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:39:23.259 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0 2026-03-08T22:39:23.259 DEBUG:teuthology.orchestra.run.vm04:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0 2026-03-08T22:39:23.314 DEBUG:teuthology.orchestra.run.vm04:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:39:23.371 INFO:tasks.workunit.client.0.vm04.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr:You are in 'detached HEAD' state. 
You can look around, make experimental 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr:state without impacting any branches by switching back to a branch. 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr:do so (now or later) by using -c with the switch command. Example: 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr: git switch -c <new-branch-name> 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr:Or undo this operation with: 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr: git switch - 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-08T22:40:53.582 INFO:tasks.workunit.client.0.vm04.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-08T22:40:53.587 DEBUG:teuthology.orchestra.run.vm04:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/standalone && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-08T22:40:53.644 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-08T22:40:53.644 DEBUG:teuthology.orchestra.run.vm04:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-08T22:40:53.700 
INFO:tasks.workunit:Running workunits matching erasure-code on client.0... 2026-03-08T22:40:53.700 INFO:tasks.workunit:Running workunit erasure-code/test-erasure-code-plugins.sh... 2026-03-08T22:40:53.700 DEBUG:teuthology.orchestra.run.vm04:workunit test erasure-code/test-erasure-code-plugins.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh 2026-03-08T22:40:53.759 INFO:tasks.workunit.client.0.vm04.stderr:+ source /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh 2026-03-08T22:40:53.759 INFO:tasks.workunit.client.0.vm04.stderr:++ TIMEOUT=300 2026-03-08T22:40:53.759 INFO:tasks.workunit.client.0.vm04.stderr:++ WAIT_FOR_CLEAN_TIMEOUT=90 2026-03-08T22:40:53.759 INFO:tasks.workunit.client.0.vm04.stderr:++ MAX_TIMEOUT=15 2026-03-08T22:40:53.759 INFO:tasks.workunit.client.0.vm04.stderr:++ PG_NUM=4 2026-03-08T22:40:53.759 INFO:tasks.workunit.client.0.vm04.stderr:++ TMPDIR=/tmp 2026-03-08T22:40:53.759 INFO:tasks.workunit.client.0.vm04.stderr:++ CEPH_BUILD_VIRTUALENV=/tmp 2026-03-08T22:40:53.759 INFO:tasks.workunit.client.0.vm04.stderr:++ TESTDIR=/home/ubuntu/cephtest 2026-03-08T22:40:53.759 INFO:tasks.workunit.client.0.vm04.stderr:++ type xmlstarlet 2026-03-08T22:40:53.759 INFO:tasks.workunit.client.0.vm04.stderr:++ XMLSTARLET=xmlstarlet 2026-03-08T22:40:53.759 INFO:tasks.workunit.client.0.vm04.stderr:+++ uname 2026-03-08T22:40:53.760 INFO:tasks.workunit.client.0.vm04.stderr:++ '[' Linux = FreeBSD ']' 2026-03-08T22:40:53.760 
INFO:tasks.workunit.client.0.vm04.stderr:++ SED=sed 2026-03-08T22:40:53.760 INFO:tasks.workunit.client.0.vm04.stderr:++ AWK=awk 2026-03-08T22:40:53.760 INFO:tasks.workunit.client.0.vm04.stderr:+++ stty -a 2026-03-08T22:40:53.760 INFO:tasks.workunit.client.0.vm04.stderr:+++ sed -e 's/.*columns \([0-9]*\).*/\1/' 2026-03-08T22:40:53.760 INFO:tasks.workunit.client.0.vm04.stderr:+++ head -1 2026-03-08T22:40:53.761 INFO:tasks.workunit.client.0.vm04.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T22:40:53.762 INFO:tasks.workunit.client.0.vm04.stderr:++ termwidth= 2026-03-08T22:40:53.762 INFO:tasks.workunit.client.0.vm04.stderr:++ '[' -n '' -a '' '!=' 0 ']' 2026-03-08T22:40:53.762 INFO:tasks.workunit.client.0.vm04.stderr:++ DIFFCOLOPTS='-y ' 2026-03-08T22:40:53.762 INFO:tasks.workunit.client.0.vm04.stderr:++ KERNCORE=kernel.core_pattern 2026-03-08T22:40:53.762 INFO:tasks.workunit.client.0.vm04.stderr:++ EXTRA_OPTS= 2026-03-08T22:40:53.764 INFO:tasks.workunit.client.0.vm04.stderr:++ test '' = TESTS 2026-03-08T22:40:53.764 INFO:tasks.workunit.client.0.vm04.stderr:++ uname -m 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:+ arch=x86_64 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:+ case $arch in 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:+ legacy_jerasure_plugins=(jerasure_generic jerasure_sse3 jerasure_sse4) 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:+ legacy_shec_plugins=(shec_generic shec_sse3 shec_sse4) 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:+ plugins=(jerasure shec lrc isa) 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:+ main test-erasure-code-plugins 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:+ local dir=td/test-erasure-code-plugins 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:+ shift 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:+ shopt -s -o xtrace 
2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/test-erasure-code-plugins 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:26: run: local dir=td/test-erasure-code-plugins 
2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:27: run: shift 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:29: run: export CEPH_MON=127.0.0.1:17110 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:29: run: CEPH_MON=127.0.0.1:17110 2026-03-08T22:40:53.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:30: run: export CEPH_ARGS 2026-03-08T22:40:53.770 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:31: run: uuidgen 2026-03-08T22:40:53.772 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:31: run: CEPH_ARGS+='--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none ' 2026-03-08T22:40:53.772 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:32: run: CEPH_ARGS+='--mon-host=127.0.0.1:17110 ' 2026-03-08T22:40:53.773 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:34: run: set 2026-03-08T22:40:53.774 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:34: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p' 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:34: run: local 'funcs=TEST_ec_profile_warning 2026-03-08T22:40:53.776 
INFO:tasks.workunit.client.0.vm04.stderr:TEST_preload_no_warning 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:TEST_preload_no_warning_default 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:TEST_preload_warning' 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:35: run: for func in $funcs 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:36: run: TEST_ec_profile_warning td/test-erasure-code-plugins 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:92: TEST_ec_profile_warning: local dir=td/test-erasure-code-plugins 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:94: TEST_ec_profile_warning: setup td/test-erasure-code-plugins 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons 
td/test-erasure-code-plugins KILL 2026-03-08T22:40:53.776 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:40:53.777 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:40:53.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:40:53.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:40:53.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:40:53.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:40:53.779 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:40:53.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:40:53.780 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:40:53.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:40:53.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:40:53.781 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:40:53.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:40:53.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:40:53.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:40:53.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:40:53.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:40:53.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:40:53.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:40:53.785 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:40:53.785 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:40:53.785 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:40:53.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:40:53.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:40:53.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:40:53.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins 2026-03-08T22:40:53.787 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:40:53.787 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:40:53.787 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:40:53.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927 2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']' 
2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096 2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT 2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:95: TEST_ec_profile_warning: run_mon td/test-erasure-code-plugins a 2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins 2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a 2026-03-08T22:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins 2026-03-08T22:40:53.820 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 
2026-03-08T22:40:53.820 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:40:53.820 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:40:53.820 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:40:53.820 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:40:53.820 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:40:53.820 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:40:53.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:40:53.850 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:40:53.850 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:40:53.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:40:53.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:40:53.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:40:53.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:40:53.851 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:40:53.851 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:40:53.851 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:40:53.852 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:40:53.852 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:40:53.852 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 
2026-03-08T22:40:53.852 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:40:53.852 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:40:53.852 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:40:53.906 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:40:53.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host 2026-03-08T22:40:53.955 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:96: TEST_ec_profile_warning: run_mgr td/test-erasure-code-plugins x 2026-03-08T22:40:53.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins 2026-03-08T22:40:53.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:40:53.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:40:53.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:40:53.956 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x 2026-03-08T22:40:53.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:40:54.074 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:40:54.074 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:40:54.074 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:40:54.074 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:40:54.074 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:40:54.074 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:40:54.074 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:40:54.074 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:40:54.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 
--debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:40:54.097 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:97: TEST_ec_profile_warning: seq 0 2 2026-03-08T22:40:54.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:97: TEST_ec_profile_warning: for id in $(seq 0 2) 2026-03-08T22:40:54.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:98: TEST_ec_profile_warning: run_osd td/test-erasure-code-plugins 0 2026-03-08T22:40:54.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins 2026-03-08T22:40:54.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 
'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 ' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:40:54.099 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' 
--pid-file=td/test-erasure-code-plugins/$name.pid' 2026-03-08T22:40:54.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:40:54.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:40:54.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:40:54.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:40:54.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:40:54.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:40:54.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0 2026-03-08T22:40:54.100 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:40:54.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=724f5d78-0a3a-44a5-a7cf-051ee05c7943 2026-03-08T22:40:54.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 724f5d78-0a3a-44a5-a7cf-051ee05c7943' 2026-03-08T22:40:54.101 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 
724f5d78-0a3a-44a5-a7cf-051ee05c7943 2026-03-08T22:40:54.101 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:40:54.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQD2+q1pNontBhAAd076woqgadD2xA82QXn8IA== 2026-03-08T22:40:54.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQD2+q1pNontBhAAd076woqgadD2xA82QXn8IA=="}' 2026-03-08T22:40:54.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 724f5d78-0a3a-44a5-a7cf-051ee05c7943 -i td/test-erasure-code-plugins/0/new.json 2026-03-08T22:40:54.264 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:40:54.274 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json 2026-03-08T22:40:54.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key 
AQD2+q1pNontBhAAd076woqgadD2xA82QXn8IA== --osd-uuid 724f5d78-0a3a-44a5-a7cf-051ee05c7943 2026-03-08T22:40:54.295 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:40:54.296+0000 7fcfb0493780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:40:54.296 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:40:54.298+0000 7fcfb0493780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:40:54.300 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:40:54.301+0000 7fcfb0493780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:40:54.300 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:40:54.302+0000 7fcfb0493780 -1 bdev(0x561f199cdc00 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:40:54.300 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:40:54.302+0000 7fcfb0493780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid 2026-03-08T22:40:56.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring 2026-03-08T22:40:56.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:40:56.398 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:40:56.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:40:56.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:40:56.520 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:40:56.520 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:40:56.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:40:56.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:40:56.522 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:40:56.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:40:56.566 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:40:56.565+0000 7f6cfcf84780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:40:56.570 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:40:56.572+0000 7f6cfcf84780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:40:56.572 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:40:56.574+0000 7f6cfcf84780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:40:56.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:40:56.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:40:56.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:40:56.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:40:56.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:40:56.677 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:40:56.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:40:56.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:40:56.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:40:56.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:40:56.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:40:57.635 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:40:57.637+0000 7f6cfcf84780 -1 Falling back to public 
interface 2026-03-08T22:40:57.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:40:57.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:40:57.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:40:57.909 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:40:57.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:40:57.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:40:58.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:40:58.518 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:40:58.520+0000 7f6cfcf84780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:40:59.149 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:40:59.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:40:59.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:40:59.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:40:59.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:40:59.149 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:40:59.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:00.374 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:41:00.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:00.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:00.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:41:00.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:00.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:00.682 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:00.773 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:00.775+0000 7f6cf8725640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:41:01.685 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T22:41:01.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:01.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:01.685 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:41:01.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:01.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:01.921 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3554008186,v1:127.0.0.1:6803/3554008186] [v2:127.0.0.1:6804/3554008186,v1:127.0.0.1:6805/3554008186] exists,up 724f5d78-0a3a-44a5-a7cf-051ee05c7943 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:97: TEST_ec_profile_warning: for id in $(seq 0 2) 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:98: TEST_ec_profile_warning: run_osd td/test-erasure-code-plugins 1 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:01.922 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/1 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 ' 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/1' 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/1/journal' 2026-03-08T22:41:01.922 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins' 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:41:01.922 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:01.923 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:41:01.923 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:41:01.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/1 2026-03-08T22:41:01.924 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:41:01.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=46644a31-b183-43d1-b539-60d4fdac021d 2026-03-08T22:41:01.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 46644a31-b183-43d1-b539-60d4fdac021d' 2026-03-08T22:41:01.925 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 46644a31-b183-43d1-b539-60d4fdac021d 2026-03-08T22:41:01.925 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:41:01.938 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQD9+q1p8uIXOBAANWK2j6oBPyVkDpnyTkTFWg== 2026-03-08T22:41:01.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQD9+q1p8uIXOBAANWK2j6oBPyVkDpnyTkTFWg=="}' 2026-03-08T22:41:01.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 46644a31-b183-43d1-b539-60d4fdac021d -i td/test-erasure-code-plugins/1/new.json 2026-03-08T22:41:02.207 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:41:02.217 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/1/new.json 2026-03-08T22:41:02.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/1 --osd-journal=td/test-erasure-code-plugins/1/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQD9+q1p8uIXOBAANWK2j6oBPyVkDpnyTkTFWg== --osd-uuid 46644a31-b183-43d1-b539-60d4fdac021d 2026-03-08T22:41:02.235 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:02.238+0000 7f5875c1a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:02.237 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:02.240+0000 7f5875c1a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:02.239 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:02.241+0000 7f5875c1a780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:02.239 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:02.241+0000 7f5875c1a780 -1 bdev(0x55a698f35c00 td/test-erasure-code-plugins/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:41:02.239 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:02.241+0000 7f5875c1a780 -1 bluestore(td/test-erasure-code-plugins/1) _read_fsid unparsable uuid 2026-03-08T22:41:05.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/1/keyring 2026-03-08T22:41:05.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:41:05.141 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T22:41:05.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:41:05.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:41:05.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:41:05.444 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T22:41:05.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/1 --osd-journal=td/test-erasure-code-plugins/1/journal --chdir= 
--run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:41:05.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:41:05.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:41:05.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:41:05.463 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:05.465+0000 7f8f8e41a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:05.470 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:05.472+0000 7f8f8e41a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:05.472 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:05.473+0000 7f8f8e41a780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:05.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:41:05.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:41:05.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:41:05.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:41:05.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:41:05.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:05.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:41:05.691 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:41:05.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:05.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:41:05.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:06.298 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:06.300+0000 7f8f8e41a780 -1 Falling back to public interface 2026-03-08T22:41:06.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:41:06.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:06.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:41:06.909 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:41:06.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:06.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:41:07.125 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:07.161 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:07.164+0000 7f8f8e41a780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:41:08.126 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:41:08.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:08.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:08.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:41:08.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:41:08.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:08.545 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:09.548 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:41:09.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:09.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:09.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:41:09.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:09.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:41:09.769 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3774401424,v1:127.0.0.1:6811/3774401424] [v2:127.0.0.1:6812/3774401424,v1:127.0.0.1:6813/3774401424] exists,up 46644a31-b183-43d1-b539-60d4fdac021d 2026-03-08T22:41:09.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:41:09.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:41:09.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:41:09.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:97: TEST_ec_profile_warning: for id 
in $(seq 0 2) 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:98: TEST_ec_profile_warning: run_osd td/test-erasure-code-plugins 2 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/2 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 ' 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:41:09.770 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/2' 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/2/journal' 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins' 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:09.770 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:09.771 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log' 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid' 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:41:09.771 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:41:09.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/2 2026-03-08T22:41:09.773 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:41:09.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=50312e51-3ca8-47a6-a00d-9ac08b767548 2026-03-08T22:41:09.774 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 50312e51-3ca8-47a6-a00d-9ac08b767548' 2026-03-08T22:41:09.774 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 50312e51-3ca8-47a6-a00d-9ac08b767548 2026-03-08T22:41:09.774 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:41:09.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAF+61pKsMOLxAA/XTUo1SkdyDrSC55QV/YbA== 2026-03-08T22:41:09.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAF+61pKsMOLxAA/XTUo1SkdyDrSC55QV/YbA=="}' 2026-03-08T22:41:09.787 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 50312e51-3ca8-47a6-a00d-9ac08b767548 -i td/test-erasure-code-plugins/2/new.json 2026-03-08T22:41:10.009 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:41:10.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/2/new.json 2026-03-08T22:41:10.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/2 --osd-journal=td/test-erasure-code-plugins/2/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAF+61pKsMOLxAA/XTUo1SkdyDrSC55QV/YbA== --osd-uuid 50312e51-3ca8-47a6-a00d-9ac08b767548 2026-03-08T22:41:10.040 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:10.043+0000 7f0041c11780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:10.042 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:10.045+0000 7f0041c11780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:10.043 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:10.046+0000 7f0041c11780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:10.044 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:10.047+0000 7f0041c11780 -1 bdev(0x557543927c00 td/test-erasure-code-plugins/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:41:10.044 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:10.047+0000 7f0041c11780 -1 bluestore(td/test-erasure-code-plugins/2) _read_fsid unparsable uuid 2026-03-08T22:41:12.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/2/keyring 2026-03-08T22:41:12.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:41:12.790 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:41:12.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:41:12.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:41:13.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:41:13.085 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:41:13.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/2 --osd-journal=td/test-erasure-code-plugins/2/journal --chdir= 
--run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:41:13.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:41:13.086 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:41:13.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:41:13.103 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:13.106+0000 7fe7cb2ed780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:13.110 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:13.114+0000 7fe7cb2ed780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:13.111 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:13.114+0000 7fe7cb2ed780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:13.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:41:13.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:41:13.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:41:13.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:41:13.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:41:13.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:13.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:41:13.322 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:41:13.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:13.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:41:13.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:14.201 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:14.204+0000 7fe7cb2ed780 -1 Falling back to public interface 2026-03-08T22:41:14.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:41:14.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:14.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:41:14.542 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:41:14.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:14.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:41:14.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:15.298 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:15.301+0000 7fe7cb2ed780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:41:15.784 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:41:15.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:15.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:15.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:41:15.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:15.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:41:16.031 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:16.454 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:16.457+0000 7fe7c6a8e640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T22:41:17.034 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:41:17.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:17.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:17.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:41:17.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:17.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:41:17.251 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1849769148,v1:127.0.0.1:6819/1849769148] [v2:127.0.0.1:6820/1849769148,v1:127.0.0.1:6821/1849769148] exists,up 50312e51-3ca8-47a6-a00d-9ac08b767548 2026-03-08T22:41:17.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:41:17.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:41:17.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:41:17.251 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:100: TEST_ec_profile_warning: create_rbd_pool 2026-03-08T22:41:17.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T22:41:17.474 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' does not exist 2026-03-08T22:41:17.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T22:41:17.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:41:17.743 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:41:17.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:41:18.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T22:41:19.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:101: TEST_ec_profile_warning: wait_for_clean 2026-03-08T22:41:19.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:41:19.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:41:19.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 
2026-03-08T22:41:19.057 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:41:19.058 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:41:19.058 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:41:19.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:41:19.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:41:19.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:41:19.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:41:19.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:41:19.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:41:19.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:41:19.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:41:19.129 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:41:19.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:41:19.353 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:41:19.353 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:41:19.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:41:19.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:41:19.353 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:41:19.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836486 2026-03-08T22:41:19.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836486 2026-03-08T22:41:19.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486' 2026-03-08T22:41:19.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:41:19.433 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:41:19.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
seq=42949672964 2026-03-08T22:41:19.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672964 2026-03-08T22:41:19.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672964' 2026-03-08T22:41:19.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:41:19.501 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:41:19.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509443 2026-03-08T22:41:19.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509443 2026-03-08T22:41:19.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672964 2-64424509443' 2026-03-08T22:41:19.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:41:19.575 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836486 2026-03-08T22:41:19.575 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:41:19.576 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:41:19.576 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836486 2026-03-08T22:41:19.576 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:41:19.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836486 2026-03-08T22:41:19.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836486' 2026-03-08T22:41:19.577 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 21474836486 2026-03-08T22:41:19.577 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:41:19.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836486 2026-03-08T22:41:19.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:41:20.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:41:20.848 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:41:21.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836486 2026-03-08T22:41:21.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T22:41:21.114 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672964 2026-03-08T22:41:21.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:41:21.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:41:21.116 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672964 2026-03-08T22:41:21.116 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:41:21.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672964 2026-03-08T22:41:21.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672964' 2026-03-08T22:41:21.117 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 42949672964 2026-03-08T22:41:21.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:41:21.336 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672964 2026-03-08T22:41:21.336 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:41:21.337 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-64424509443 2026-03-08T22:41:21.337 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:41:21.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:41:21.339 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509443 2026-03-08T22:41:21.339 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:41:21.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509443 2026-03-08T22:41:21.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509443' 2026-03-08T22:41:21.340 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 64424509443 2026-03-08T22:41:21.340 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:41:21.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509443 -lt 64424509443 2026-03-08T22:41:21.567 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:41:21.567 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:41:21.567 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:41:21.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T22:41:21.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:41:21.871 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:41:21.871 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:41:21.871 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:41:21.871 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:41:21.872 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:41:21.872 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:41:22.095 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T22:41:22.095 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:41:22.095 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:41:22.095 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:41:22.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T22:41:22.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:41:22.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:41:22.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:103: TEST_ec_profile_warning: for plugin in ${legacy_jerasure_plugins[*]} 2026-03-08T22:41:22.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:104: TEST_ec_profile_warning: ceph osd erasure-code-profile set prof-jerasure_generic crush-failure-domain=osd technique=reed_sol_van plugin=jerasure_generic 2026-03-08T22:41:22.712 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:105: TEST_ec_profile_warning: get_asok_path mon.a 2026-03-08T22:41:22.712 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:22.712 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:22.712 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:22.712 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:22.712 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:22.712 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:22.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:105: TEST_ec_profile_warning: CEPH_ARGS= 2026-03-08T22:41:22.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:105: TEST_ec_profile_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:41:22.757 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:22.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:106: TEST_ec_profile_warning: grep 'WARNING: erasure coding profile prof-jerasure_generic uses plugin jerasure_generic' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:41:22.765 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:22.565+0000 7f4660bfa640 0 mon.a@0(leader).osd e23 WARNING: erasure coding profile prof-jerasure_generic uses plugin jerasure_generic that has been deprecated. Please use jerasure instead. 
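The repeated `get_asok_path` / `get_asok_dir` calls traced here follow a simple rule: the admin-socket directory defaults to `/tmp/ceph-asok.<pid>` unless overridden by `CEPH_ASOK_DIR`, and a daemon name selects the concrete `ceph-<name>.asok` file. A sketch in Python, with the PID taken from the trace purely for illustration:

```python
import os

def get_asok_dir(pid, env=os.environ):
    # Mirrors the shell helper's `[ -n "$CEPH_ASOK_DIR" ]` check.
    return env.get("CEPH_ASOK_DIR") or f"/tmp/ceph-asok.{pid}"

def get_asok_path(name, pid, env=os.environ):
    # With a daemon name the file is ceph-<name>.asok; without one the
    # helper emits the templated '$cluster-$name.asok' form for daemons
    # to expand themselves.
    d = get_asok_dir(pid, env)
    if name:
        return f"{d}/ceph-{name}.asok"
    return f"{d}/$cluster-$name.asok"

print(get_asok_path("mon.a", 50927, env={}))
# /tmp/ceph-asok.50927/ceph-mon.a.asok, matching the echoed path above
```

This is why the trace shows `get_asok_dir` echoing `/tmp/ceph-asok.50927` on every call: the environment override is empty, so the PID-derived default wins each time.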
2026-03-08T22:41:22.765 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:22.704+0000 7f4660bfa640 0 mon.a@0(leader).osd e24 WARNING: erasure coding profile prof-jerasure_generic uses plugin jerasure_generic that has been deprecated. Please use jerasure instead. 2026-03-08T22:41:22.765 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:22.704+0000 7f4660bfa640 0 mon.a@0(leader).osd e24 WARNING: erasure coding profile prof-jerasure_generic uses plugin jerasure_generic that has been deprecated. Please use jerasure instead. 2026-03-08T22:41:22.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:103: TEST_ec_profile_warning: for plugin in ${legacy_jerasure_plugins[*]} 2026-03-08T22:41:22.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:104: TEST_ec_profile_warning: ceph osd erasure-code-profile set prof-jerasure_sse3 crush-failure-domain=osd technique=reed_sol_van plugin=jerasure_sse3 2026-03-08T22:41:23.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:105: TEST_ec_profile_warning: get_asok_path mon.a 2026-03-08T22:41:23.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:23.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:23.065 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:23.065 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:23.065 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:23.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:23.066 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:105: TEST_ec_profile_warning: CEPH_ARGS= 2026-03-08T22:41:23.066 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:105: TEST_ec_profile_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:41:23.120 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:23.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:106: TEST_ec_profile_warning: grep 'WARNING: erasure coding profile prof-jerasure_sse3 uses plugin jerasure_sse3' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:41:23.128 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:22.942+0000 7f4660bfa640 0 mon.a@0(leader).osd e24 WARNING: erasure coding profile prof-jerasure_sse3 uses plugin jerasure_sse3 that has been deprecated. Please use jerasure instead. 2026-03-08T22:41:23.128 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:23.057+0000 7f4660bfa640 0 mon.a@0(leader).osd e25 WARNING: erasure coding profile prof-jerasure_sse3 uses plugin jerasure_sse3 that has been deprecated. Please use jerasure instead. 2026-03-08T22:41:23.128 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:23.057+0000 7f4660bfa640 0 mon.a@0(leader).osd e25 WARNING: erasure coding profile prof-jerasure_sse3 uses plugin jerasure_sse3 that has been deprecated. Please use jerasure instead. 
2026-03-08T22:41:23.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:103: TEST_ec_profile_warning: for plugin in ${legacy_jerasure_plugins[*]} 2026-03-08T22:41:23.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:104: TEST_ec_profile_warning: ceph osd erasure-code-profile set prof-jerasure_sse4 crush-failure-domain=osd technique=reed_sol_van plugin=jerasure_sse4 2026-03-08T22:41:23.375 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:105: TEST_ec_profile_warning: get_asok_path mon.a 2026-03-08T22:41:23.375 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:23.375 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:23.375 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:23.375 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:23.375 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:23.375 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:23.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:105: TEST_ec_profile_warning: CEPH_ARGS= 
2026-03-08T22:41:23.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:105: TEST_ec_profile_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:41:23.421 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:23.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:106: TEST_ec_profile_warning: grep 'WARNING: erasure coding profile prof-jerasure_sse4 uses plugin jerasure_sse4' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:41:23.428 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:23.294+0000 7f4660bfa640 0 mon.a@0(leader).osd e25 WARNING: erasure coding profile prof-jerasure_sse4 uses plugin jerasure_sse4 that has been deprecated. Please use jerasure instead. 2026-03-08T22:41:23.428 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:23.367+0000 7f4660bfa640 0 mon.a@0(leader).osd e26 WARNING: erasure coding profile prof-jerasure_sse4 uses plugin jerasure_sse4 that has been deprecated. Please use jerasure instead. 2026-03-08T22:41:23.428 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:23.368+0000 7f4660bfa640 0 mon.a@0(leader).osd e26 WARNING: erasure coding profile prof-jerasure_sse4 uses plugin jerasure_sse4 that has been deprecated. Please use jerasure instead. 
2026-03-08T22:41:23.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:109: TEST_ec_profile_warning: for plugin in ${legacy_shec_plugins[*]} 2026-03-08T22:41:23.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:110: TEST_ec_profile_warning: ceph osd erasure-code-profile set prof-shec_generic crush-failure-domain=osd plugin=shec_generic 2026-03-08T22:41:23.722 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:111: TEST_ec_profile_warning: get_asok_path mon.a 2026-03-08T22:41:23.722 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:23.722 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:23.722 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:23.722 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:23.722 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:23.722 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:23.722 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:111: TEST_ec_profile_warning: CEPH_ARGS= 2026-03-08T22:41:23.722 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:111: TEST_ec_profile_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:41:23.772 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:23.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:112: TEST_ec_profile_warning: grep 'WARNING: erasure coding profile prof-shec_generic uses plugin shec_generic' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:41:23.780 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:23.596+0000 7f4660bfa640 0 mon.a@0(leader).osd e26 WARNING: erasure coding profile prof-shec_generic uses plugin shec_generic that has been deprecated. Please use shec instead. 2026-03-08T22:41:23.780 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:23.715+0000 7f4660bfa640 0 mon.a@0(leader).osd e27 WARNING: erasure coding profile prof-shec_generic uses plugin shec_generic that has been deprecated. Please use shec instead. 2026-03-08T22:41:23.780 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:23.715+0000 7f4660bfa640 0 mon.a@0(leader).osd e27 WARNING: erasure coding profile prof-shec_generic uses plugin shec_generic that has been deprecated. Please use shec instead. 
2026-03-08T22:41:23.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:109: TEST_ec_profile_warning: for plugin in ${legacy_shec_plugins[*]} 2026-03-08T22:41:23.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:110: TEST_ec_profile_warning: ceph osd erasure-code-profile set prof-shec_sse3 crush-failure-domain=osd plugin=shec_sse3 2026-03-08T22:41:24.037 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:111: TEST_ec_profile_warning: get_asok_path mon.a 2026-03-08T22:41:24.038 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:24.038 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:24.038 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:24.038 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.038 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:24.038 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:24.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:111: TEST_ec_profile_warning: CEPH_ARGS= 2026-03-08T22:41:24.038 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:111: TEST_ec_profile_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:41:24.080 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:24.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:112: TEST_ec_profile_warning: grep 'WARNING: erasure coding profile prof-shec_sse3 uses plugin shec_sse3' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:41:24.088 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:23.950+0000 7f4660bfa640 0 mon.a@0(leader).osd e28 WARNING: erasure coding profile prof-shec_sse3 uses plugin shec_sse3 that has been deprecated. Please use shec instead. 2026-03-08T22:41:24.088 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:24.031+0000 7f4660bfa640 0 mon.a@0(leader).osd e29 WARNING: erasure coding profile prof-shec_sse3 uses plugin shec_sse3 that has been deprecated. Please use shec instead. 2026-03-08T22:41:24.088 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:24.031+0000 7f4660bfa640 0 mon.a@0(leader).osd e29 WARNING: erasure coding profile prof-shec_sse3 uses plugin shec_sse3 that has been deprecated. Please use shec instead. 
2026-03-08T22:41:24.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:109: TEST_ec_profile_warning: for plugin in ${legacy_shec_plugins[*]} 2026-03-08T22:41:24.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:110: TEST_ec_profile_warning: ceph osd erasure-code-profile set prof-shec_sse4 crush-failure-domain=osd plugin=shec_sse4 2026-03-08T22:41:24.387 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:111: TEST_ec_profile_warning: get_asok_path mon.a 2026-03-08T22:41:24.387 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:24.387 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:24.388 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:24.388 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.388 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:24.388 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:24.388 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:111: TEST_ec_profile_warning: CEPH_ARGS= 2026-03-08T22:41:24.388 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:111: TEST_ec_profile_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:41:24.437 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:24.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:112: TEST_ec_profile_warning: grep 'WARNING: erasure coding profile prof-shec_sse4 uses plugin shec_sse4' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:41:24.445 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:24.256+0000 7f4660bfa640 0 mon.a@0(leader).osd e29 WARNING: erasure coding profile prof-shec_sse4 uses plugin shec_sse4 that has been deprecated. Please use shec instead. 2026-03-08T22:41:24.445 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:24.379+0000 7f4660bfa640 0 mon.a@0(leader).osd e30 WARNING: erasure coding profile prof-shec_sse4 uses plugin shec_sse4 that has been deprecated. Please use shec instead. 2026-03-08T22:41:24.445 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:41:24.379+0000 7f4660bfa640 0 mon.a@0(leader).osd e30 WARNING: erasure coding profile prof-shec_sse4 uses plugin shec_sse4 that has been deprecated. Please use shec instead. 
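Both loops traced above (`legacy_jerasure_plugins`, then `legacy_shec_plugins`) repeat the same three steps per plugin: set an EC profile using the legacy plugin name, flush the mon log via the admin socket, and grep the log for the deprecation warning. A sketch of just the grep step, against a hypothetical log excerpt:

```python
import re

# Hypothetical excerpt of td/test-erasure-code-plugins/mon.a.log.
mon_log = (
    "mon.a@0(leader).osd e23 WARNING: erasure coding profile "
    "prof-jerasure_generic uses plugin jerasure_generic that has been "
    "deprecated. Please use jerasure instead.\n"
)

def warning_logged(log, profile, plugin):
    # Mirrors: grep 'WARNING: erasure coding profile <prof> uses
    # plugin <plugin>' <dir>/mon.a.log
    pattern = (f"WARNING: erasure coding profile {re.escape(profile)}"
               f" uses plugin {re.escape(plugin)}")
    return re.search(pattern, log) is not None

assert warning_logged(mon_log, "prof-jerasure_generic", "jerasure_generic")
assert not warning_logged(mon_log, "prof-shec_generic", "shec_generic")
```

The `log flush` via the admin socket matters here: without it the mon may not yet have written the warning to `mon.a.log` when the grep runs, which would make the check flaky.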
2026-03-08T22:41:24.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:115: TEST_ec_profile_warning: teardown td/test-erasure-code-plugins 2026-03-08T22:41:24.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:24.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:41:24.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:41:24.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:41:24.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:41:24.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:41:24.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:41:24.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:41:24.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:41:24.565 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:41:24.566 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:41:24.566 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:41:24.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:41:24.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:41:24.568 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:41:24.568 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:24.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:41:24.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:41:24.569 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:24.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:41:24.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:41:24.571 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:41:24.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:41:24.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:24.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:35: run: for func in $funcs 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:36: run: TEST_preload_no_warning td/test-erasure-code-plugins 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:58: TEST_preload_no_warning: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:60: TEST_preload_no_warning: for plugin in 
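The `teardown` trace above kills the daemons, skips the btrfs subvolume path (the filesystem is xfs), then checks `kernel.core_pattern` for coredumps before removing the test and asok directories. A sketch of the coredump check, with the filesystem listing stubbed so nothing real is touched:

```python
import os

def core_files_present(core_pattern, listdir=os.listdir):
    # Mirrors the shell: a piped pattern ('|handler ...') leaves nothing
    # on disk; otherwise, if the pattern's basename starts or ends with
    # "core" (grep -q '^core\|core$'), list its directory and report
    # cores when it is non-empty.
    if core_pattern.startswith("|"):
        return False
    base = os.path.basename(core_pattern)
    if not (base.startswith("core") or base.endswith("core")):
        return False
    try:
        return bool(listdir(os.path.dirname(core_pattern)))
    except FileNotFoundError:
        return False

# The traced pattern ends in "core" and `ls` found the coredump
# directory empty, so teardown proceeds without dumping logs.
print(core_files_present(
    "/home/ubuntu/cephtest/archive/coredump/%t.%p.core",
    listdir=lambda d: []))  # False
```

Only when cores are found (or log dumping is forced) does teardown display the daemon logs; otherwise it goes straight to `rm -fr` of the test directory, as seen above.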
${plugins[*]} 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:61: TEST_preload_no_warning: setup td/test-erasure-code-plugins 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:41:24.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:41:24.587 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:41:24.587 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:41:24.589 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:41:24.589 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:41:24.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:41:24.590 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:41:24.591 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:41:24.591 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:41:24.591 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:41:24.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:24.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:41:24.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:41:24.592 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:24.593 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:41:24.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:41:24.595 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:41:24.596 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:41:24.596 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.596 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:24.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:41:24.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:41:24.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:41:24.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins 2026-03-08T22:41:24.599 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:41:24.599 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: 
get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.599 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:24.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927 2026-03-08T22:41:24.600 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:41:24.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:41:24.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:41:24.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT 2026-03-08T22:41:24.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:62: TEST_preload_no_warning: run_mon td/test-erasure-code-plugins a --osd_erasure_code_plugins=jerasure 2026-03-08T22:41:24.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:24.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:41:24.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:41:24.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:41:24.601 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a 2026-03-08T22:41:24.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins --osd_erasure_code_plugins=jerasure 2026-03-08T22:41:24.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:41:24.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:24.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:24.630 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:24.630 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.630 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:24.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:24.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 
--debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_erasure_code_plugins=jerasure 2026-03-08T22:41:24.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:41:24.739 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:41:24.739 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:41:24.739 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:41:24.739 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:41:24.740 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:41:24.740 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:41:24.740 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:24.740 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:24.742 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:24.742 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.742 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:24.742 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:24.742 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:41:24.742 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid 2026-03-08T22:41:24.798 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:41:24.798 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:41:24.798 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:41:24.798 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:41:24.799 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:41:24.799 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:41:24.799 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:24.799 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:24.800 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:24.800 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.800 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:24.800 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:24.800 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:41:24.800 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host 2026-03-08T22:41:24.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:63: TEST_preload_no_warning: run_mgr td/test-erasure-code-plugins x 
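The `get_config` calls traced above follow one pattern throughout: ask the daemon for a config key over its admin socket with `--format json`, then extract the single value with `jq -r`. A minimal standalone sketch of that pattern is below; since no live cluster is available outside the test run, a stub function stands in for the real `CEPH_ARGS='' ceph --format json daemon <asok-path> config get <key>` call, and the fsid value is copied from this log rather than queried.

```shell
#!/usr/bin/env bash
# Sketch of the get_config pattern from ceph-helpers.sh: query a daemon's
# admin socket for a config key as JSON, then pull the value out with jq.
set -euo pipefail

# Stub standing in for:
#   CEPH_ARGS='' ceph --format json daemon "$(get_asok_path mon.a)" config get "$1"
# It returns the same JSON shape the monitor would, using the fsid seen above.
daemon_config_get() {
    local config=$1
    echo "{\"${config}\": \"728c4f63-a414-408c-84f0-672605d9b467\"}"
}

get_config() {
    local config=$1
    # -r strips the JSON quoting so callers get the bare value
    daemon_config_get "$config" | jq -r ".${config}"
}

get_config fsid
```

Note the `CEPH_ARGS=` prefix in the traced call: the helpers clear it so the admin-socket query is not polluted by the cluster-wide arguments the test framework exports.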
2026-03-08T22:41:24.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:24.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:41:24.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:41:24.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:41:24.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x 2026-03-08T22:41:24.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:41:24.978 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:41:24.978 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:24.978 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:24.979 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:24.979 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:24.979 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: 
get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:24.979 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:24.979 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:41:24.980 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:41:25.003 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: get_asok_path mon.a 2026-03-08T22:41:25.003 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:25.003 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:25.003 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:25.004 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:25.004 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:25.004 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:25.004 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: CEPH_ARGS= 2026-03-08T22:41:25.004 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:41:25.050 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:25.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:65: TEST_preload_no_warning: run_osd td/test-erasure-code-plugins 0 --osd_erasure_code_plugins=jerasure 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0 2026-03-08T22:41:25.056 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 ' 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0' 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal' 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins' 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:41:25.056 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:25.056 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log' 2026-03-08T22:41:25.057 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_erasure_code_plugins=jerasure 2026-03-08T22:41:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0 2026-03-08T22:41:25.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:41:25.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=befd8c10-1cfe-4afe-a63f-be5c45013b59 2026-03-08T22:41:25.059 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 befd8c10-1cfe-4afe-a63f-be5c45013b59' 2026-03-08T22:41:25.059 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 befd8c10-1cfe-4afe-a63f-be5c45013b59 2026-03-08T22:41:25.059 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:41:25.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAV+61pwcBqBBAA8mp0/nOqT+fNCqgvk6f/Nw== 2026-03-08T22:41:25.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAV+61pwcBqBBAA8mp0/nOqT+fNCqgvk6f/Nw=="}' 2026-03-08T22:41:25.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new befd8c10-1cfe-4afe-a63f-be5c45013b59 -i td/test-erasure-code-plugins/0/new.json 2026-03-08T22:41:25.210 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:41:25.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json 2026-03-08T22:41:25.221 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 
'--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=jerasure --mkfs --key AQAV+61pwcBqBBAA8mp0/nOqT+fNCqgvk6f/Nw== --osd-uuid befd8c10-1cfe-4afe-a63f-be5c45013b59 2026-03-08T22:41:25.241 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:25.243+0000 7fd2c0ac8780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:25.243 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:25.246+0000 7fd2c0ac8780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:25.246 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:25.249+0000 7fd2c0ac8780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:25.247 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:25.249+0000 7fd2c0ac8780 -1 bdev(0x555b2b97a800 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:41:25.249 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:25.250+0000 7fd2c0ac8780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid 2026-03-08T22:41:27.496 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring 2026-03-08T22:41:27.496 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:41:27.497 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:41:27.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:41:27.497 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:41:27.781 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:41:27.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:41:27.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=jerasure 2026-03-08T22:41:27.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:41:27.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:41:27.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:41:27.806 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:27.808+0000 7f568f2d9780 -1 WARNING: 
all dangerous and experimental features are enabled. 2026-03-08T22:41:27.860 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:27.863+0000 7f568f2d9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:27.862 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:27.864+0000 7f568f2d9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:28.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:41:28.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:41:28.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:41:28.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:41:28.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:41:28.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:28.043 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:41:28.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:41:28.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:28.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:28.297 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:28.672 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:28.675+0000 7f568f2d9780 -1 Falling back to public interface 2026-03-08T22:41:29.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:29.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:29.298 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:41:29.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:41:29.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:29.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:29.605 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:29.712 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:29.715+0000 7f568f2d9780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:41:30.607 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:41:30.607 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:30.607 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:30.607 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:41:30.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:30.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:30.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:31.852 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:41:31.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:31.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:31.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:41:31.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:31.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:32.007 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:32.010+0000 7f568aa7a640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:41:32.093 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/309007976,v1:127.0.0.1:6803/309007976] [v2:127.0.0.1:6804/309007976,v1:127.0.0.1:6805/309007976] exists,up befd8c10-1cfe-4afe-a63f-be5c45013b59 2026-03-08T22:41:32.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: 
status=0 2026-03-08T22:41:32.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:41:32.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:41:32.093 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: get_asok_path osd.0 2026-03-08T22:41:32.093 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:41:32.093 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:41:32.093 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:32.093 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:32.093 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:32.093 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-osd.0.asok 2026-03-08T22:41:32.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: CEPH_ARGS= 2026-03-08T22:41:32.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: ceph --admin-daemon 
/tmp/ceph-asok.50927/ceph-osd.0.asok log flush 2026-03-08T22:41:32.151 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:32.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:67: TEST_preload_no_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:41:32.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:68: TEST_preload_no_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin' td/test-erasure-code-plugins/osd.0.log 2026-03-08T22:41:32.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:69: TEST_preload_no_warning: teardown td/test-erasure-code-plugins 2026-03-08T22:41:32.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:32.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:41:32.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:41:32.161 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:41:32.161 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:41:32.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:41:32.161 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:41:32.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:41:32.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:41:32.272 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:41:32.273 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:41:32.273 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:41:32.274 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:41:32.274 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:41:32.275 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:41:32.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:32.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:41:32.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q 
'^core\|core$' 2026-03-08T22:41:32.276 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:32.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:41:32.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:41:32.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:41:32.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:41:32.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:32.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:32.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:60: TEST_preload_no_warning: for plugin in ${plugins[*]} 
2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:61: TEST_preload_no_warning: setup td/test-erasure-code-plugins 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:41:32.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:41:32.288 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:41:32.288 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:41:32.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:41:32.289 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:41:32.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:41:32.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:41:32.291 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:41:32.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:32.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:41:32.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:41:32.292 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:32.293 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:41:32.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:41:32.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:41:32.295 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:41:32.295 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:32.296 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:32.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:41:32.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:41:32.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:41:32.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins 2026-03-08T22:41:32.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:41:32.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: 
get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:32.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:32.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927 2026-03-08T22:41:32.299 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:41:32.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:41:32.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:41:32.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT 2026-03-08T22:41:32.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:62: TEST_preload_no_warning: run_mon td/test-erasure-code-plugins a --osd_erasure_code_plugins=shec 2026-03-08T22:41:32.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:32.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:41:32.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:41:32.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:41:32.300 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a 2026-03-08T22:41:32.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins --osd_erasure_code_plugins=shec 2026-03-08T22:41:32.330 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:41:32.330 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:32.330 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:32.331 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:32.331 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:32.331 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:32.331 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:32.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 
--debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_erasure_code_plugins=shec 2026-03-08T22:41:32.364 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:41:32.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:41:32.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:41:32.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:41:32.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:41:32.365 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:41:32.365 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:32.365 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:32.366 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:41:32.366 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:32.366 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:32.366 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:32.367 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:32.368 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:41:32.368 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid 2026-03-08T22:41:32.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:41:32.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:41:32.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:41:32.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:41:32.423 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:41:32.423 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:41:32.423 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:32.423 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:32.424 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:32.424 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:32.424 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:32.424 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:32.424 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:41:32.424 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host 2026-03-08T22:41:32.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:63: TEST_preload_no_warning: run_mgr td/test-erasure-code-plugins x 
2026-03-08T22:41:32.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:32.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:41:32.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:41:32.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:41:32.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x 2026-03-08T22:41:32.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:41:32.597 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:41:32.597 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:32.597 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:32.597 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:32.597 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:32.597 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: 
get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:32.597 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:32.598 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:41:32.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:41:32.622 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: get_asok_path mon.a 2026-03-08T22:41:32.622 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:32.622 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:32.624 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:32.624 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:32.624 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:32.624 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:32.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: CEPH_ARGS= 2026-03-08T22:41:32.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:41:32.671 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:32.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:65: TEST_preload_no_warning: run_osd td/test-erasure-code-plugins 0 --osd_erasure_code_plugins=shec 2026-03-08T22:41:32.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:32.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:41:32.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:41:32.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:41:32.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0 2026-03-08T22:41:32.676 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 ' 2026-03-08T22:41:32.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:41:32.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:41:32.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:41:32.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0' 2026-03-08T22:41:32.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal' 2026-03-08T22:41:32.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:41:32.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:41:32.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins' 2026-03-08T22:41:32.677 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:41:32.677 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:32.677 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:32.677 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:32.677 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:32.677 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log' 2026-03-08T22:41:32.678 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid' 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_erasure_code_plugins=shec 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0 2026-03-08T22:41:32.678 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:41:32.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=45fc3274-3731-42e7-b0ee-ec0e5c872188 2026-03-08T22:41:32.679 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 45fc3274-3731-42e7-b0ee-ec0e5c872188 
2026-03-08T22:41:32.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 45fc3274-3731-42e7-b0ee-ec0e5c872188' 2026-03-08T22:41:32.679 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:41:32.694 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAc+61pqM50KRAABFGyCpduQOfena6kS96/tg== 2026-03-08T22:41:32.694 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAc+61pqM50KRAABFGyCpduQOfena6kS96/tg=="}' 2026-03-08T22:41:32.694 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 45fc3274-3731-42e7-b0ee-ec0e5c872188 -i td/test-erasure-code-plugins/0/new.json 2026-03-08T22:41:32.827 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:41:32.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json 2026-03-08T22:41:32.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 
--osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=shec --mkfs --key AQAc+61pqM50KRAABFGyCpduQOfena6kS96/tg== --osd-uuid 45fc3274-3731-42e7-b0ee-ec0e5c872188 2026-03-08T22:41:32.856 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:32.858+0000 7efccac5f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:32.861 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:32.864+0000 7efccac5f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:32.862 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:32.865+0000 7efccac5f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:32.862 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:32.865+0000 7efccac5f780 -1 bdev(0x55aafcbf4c00 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:41:32.862 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:32.865+0000 7efccac5f780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid 2026-03-08T22:41:34.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring 2026-03-08T22:41:34.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:41:34.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:41:34.962 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:41:34.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i 
td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:41:35.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:41:35.087 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:41:35.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=shec 2026-03-08T22:41:35.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:41:35.090 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:41:35.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:41:35.109 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:35.110+0000 7fce2c40f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:35.113 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:35.116+0000 7fce2c40f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:35.115 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:35.118+0000 7fce2c40f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:35.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:41:35.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:41:35.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:41:35.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:41:35.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:41:35.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:35.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:41:35.311 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:41:35.312 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:35.312 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:35.535 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:35.677 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:35.679+0000 7fce2c40f780 -1 Falling back to public interface 2026-03-08T22:41:36.534 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:36.537+0000 7fce2c40f780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:41:36.536 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:41:36.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:36.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:36.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:41:36.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:36.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:36.768 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:37.770 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:41:37.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:37.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:37.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:41:37.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:37.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:37.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:39.000 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:41:39.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:39.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:39.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:41:39.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:39.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:39.220 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2513157605,v1:127.0.0.1:6803/2513157605] [v2:127.0.0.1:6804/2513157605,v1:127.0.0.1:6805/2513157605] exists,up 45fc3274-3731-42e7-b0ee-ec0e5c872188 2026-03-08T22:41:39.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:41:39.220 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:41:39.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:41:39.220 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: get_asok_path osd.0 2026-03-08T22:41:39.221 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:41:39.221 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:41:39.221 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:39.221 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:39.221 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:39.221 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-osd.0.asok 2026-03-08T22:41:39.221 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: CEPH_ARGS= 2026-03-08T22:41:39.221 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-osd.0.asok log 
flush 2026-03-08T22:41:39.275 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:39.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:67: TEST_preload_no_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:41:39.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:68: TEST_preload_no_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin' td/test-erasure-code-plugins/osd.0.log 2026-03-08T22:41:39.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:69: TEST_preload_no_warning: teardown td/test-erasure-code-plugins 2026-03-08T22:41:39.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:39.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:41:39.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:41:39.285 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:41:39.285 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:41:39.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:41:39.286 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:41:39.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:41:39.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:41:39.404 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:41:39.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:41:39.405 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:41:39.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:41:39.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:41:39.406 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:41:39.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:39.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:41:39.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q 
'^core\|core$' 2026-03-08T22:41:39.408 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:39.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:41:39.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:41:39.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:41:39.414 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:41:39.414 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:39.414 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:39.414 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:41:39.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:41:39.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:41:39.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:60: TEST_preload_no_warning: for plugin in ${plugins[*]} 
2026-03-08T22:41:39.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:61: TEST_preload_no_warning: setup td/test-erasure-code-plugins 2026-03-08T22:41:39.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:39.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins 2026-03-08T22:41:39.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:39.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:41:39.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:41:39.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:41:39.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:41:39.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:41:39.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:41:39.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:41:39.417 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:41:39.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:41:39.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:41:39.418 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:41:39.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:41:39.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:41:39.419 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:41:39.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:39.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:41:39.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:41:39.420 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:39.421 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:41:39.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:41:39.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:41:39.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:41:39.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:39.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:39.423 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:41:39.424 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:41:39.424 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:41:39.424 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins 2026-03-08T22:41:39.425 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:41:39.425 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: 
get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:39.425 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:39.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927 2026-03-08T22:41:39.426 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:41:39.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:41:39.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:41:39.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT 2026-03-08T22:41:39.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:62: TEST_preload_no_warning: run_mon td/test-erasure-code-plugins a --osd_erasure_code_plugins=lrc 2026-03-08T22:41:39.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:39.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:41:39.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:41:39.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:41:39.426 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a 2026-03-08T22:41:39.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins --osd_erasure_code_plugins=lrc 2026-03-08T22:41:39.452 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:41:39.452 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:39.452 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:39.452 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:39.452 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:39.452 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:39.452 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:39.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 
--debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_erasure_code_plugins=lrc 2026-03-08T22:41:39.483 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:41:39.484 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:41:39.484 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:41:39.484 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:41:39.484 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:41:39.484 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:41:39.484 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:39.484 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:39.484 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:41:39.486 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:39.486 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:39.486 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:39.486 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:39.486 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:41:39.486 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid 2026-03-08T22:41:39.542 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:41:39.542 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:41:39.542 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:41:39.542 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:41:39.545 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:41:39.546 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:41:39.546 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:39.546 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:39.546 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:39.546 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:39.546 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:39.546 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:39.547 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:41:39.547 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host 2026-03-08T22:41:39.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:63: TEST_preload_no_warning: run_mgr td/test-erasure-code-plugins x 
2026-03-08T22:41:39.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:39.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:41:39.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:41:39.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:41:39.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x 2026-03-08T22:41:39.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:41:39.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:41:39.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:39.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:39.725 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:39.725 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:39.725 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: 
get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:39.725 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:39.725 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:41:39.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:41:39.750 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: get_asok_path mon.a 2026-03-08T22:41:39.750 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:39.750 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:39.750 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:39.750 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:39.750 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:39.750 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:41:39.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: CEPH_ARGS= 2026-03-08T22:41:39.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:41:39.801 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:39.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:65: TEST_preload_no_warning: run_osd td/test-erasure-code-plugins 0 --osd_erasure_code_plugins=lrc 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0 2026-03-08T22:41:39.808 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 ' 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0' 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal' 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins' 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:41:39.808 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:39.808 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log' 2026-03-08T22:41:39.809 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid' 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_erasure_code_plugins=lrc 2026-03-08T22:41:39.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0 2026-03-08T22:41:39.810 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:41:39.811 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=00445e1c-75ea-4ae3-a410-297fa6f3f86e 2026-03-08T22:41:39.811 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 00445e1c-75ea-4ae3-a410-297fa6f3f86e' 2026-03-08T22:41:39.811 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 00445e1c-75ea-4ae3-a410-297fa6f3f86e 2026-03-08T22:41:39.811 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:41:39.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAj+61p+/07MRAAMlG9v4fppQ41MP0QtiP2Og== 2026-03-08T22:41:39.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAj+61p+/07MRAAMlG9v4fppQ41MP0QtiP2Og=="}' 2026-03-08T22:41:39.825 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 00445e1c-75ea-4ae3-a410-297fa6f3f86e -i td/test-erasure-code-plugins/0/new.json 2026-03-08T22:41:39.959 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:41:39.969 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json 2026-03-08T22:41:39.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 
'--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=lrc --mkfs --key AQAj+61p+/07MRAAMlG9v4fppQ41MP0QtiP2Og== --osd-uuid 00445e1c-75ea-4ae3-a410-297fa6f3f86e 2026-03-08T22:41:39.993 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:39.995+0000 7f2137211780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:39.994 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:39.997+0000 7f2137211780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:39.995 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:39.999+0000 7f2137211780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:39.996 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:39.999+0000 7f2137211780 -1 bdev(0x55fa13801c00 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:41:39.996 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:39.999+0000 7f2137211780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid 2026-03-08T22:41:42.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring 2026-03-08T22:41:42.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:41:42.099 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:41:42.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:41:42.099 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:41:42.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:41:42.418 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:41:42.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=lrc 2026-03-08T22:41:42.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:41:42.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:41:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:41:42.438 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:42.441+0000 7f30fee12780 -1 WARNING: all 
dangerous and experimental features are enabled. 2026-03-08T22:41:42.447 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:42.450+0000 7f30fee12780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:42.448 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:42.451+0000 7f30fee12780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:42.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:41:42.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:41:42.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:41:42.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:41:42.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:41:42.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:42.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:41:42.662 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:41:42.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:42.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:43.116 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:43.265 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:43.268+0000 7f30fee12780 -1 Falling back to public interface 2026-03-08T22:41:44.119 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:41:44.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:44.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:44.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:41:44.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:44.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:44.124 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:44.127+0000 7f30fee12780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:41:44.350 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:45.351 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:41:45.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:45.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:45.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:41:45.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:45.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:45.591 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:45.659 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:45.662+0000 7f30f9ee6640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:41:46.593 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:41:46.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:46.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:46.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:41:46.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:46.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:46.828 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3792491207,v1:127.0.0.1:6803/3792491207] [v2:127.0.0.1:6804/3792491207,v1:127.0.0.1:6805/3792491207] exists,up 00445e1c-75ea-4ae3-a410-297fa6f3f86e 2026-03-08T22:41:46.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: 
status=0 2026-03-08T22:41:46.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:41:46.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:41:46.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: get_asok_path osd.0 2026-03-08T22:41:46.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:41:46.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:41:46.830 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:46.830 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:46.830 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:46.830 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-osd.0.asok 2026-03-08T22:41:46.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: CEPH_ARGS= 2026-03-08T22:41:46.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: ceph --admin-daemon 
/tmp/ceph-asok.50927/ceph-osd.0.asok log flush 2026-03-08T22:41:46.883 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:41:46.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:67: TEST_preload_no_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:41:46.892 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:68: TEST_preload_no_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin' td/test-erasure-code-plugins/osd.0.log 2026-03-08T22:41:46.893 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:69: TEST_preload_no_warning: teardown td/test-erasure-code-plugins 2026-03-08T22:41:46.893 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:46.893 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:41:46.893 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:41:46.893 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:41:46.893 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:41:46.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:41:46.894 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:41:46.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:41:47.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:41:47.009 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:41:47.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:41:47.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:41:47.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:41:47.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:41:47.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:41:47.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:47.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:41:47.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q 
'^core\|core$' 2026-03-08T22:41:47.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:47.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:41:47.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:41:47.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:41:47.017 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:41:47.017 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:47.017 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:47.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:41:47.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:41:47.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:41:47.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:60: TEST_preload_no_warning: for plugin in ${plugins[*]} 
2026-03-08T22:41:47.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:61: TEST_preload_no_warning: setup td/test-erasure-code-plugins
2026-03-08T22:41:47.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins
2026-03-08T22:41:47.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins
2026-03-08T22:41:47.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins
2026-03-08T22:41:47.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:41:47.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL
2026-03-08T22:41:47.020 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:41:47.020 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:41:47.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:41:47.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:41:47.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:41:47.022 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:41:47.023 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:41:47.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:41:47.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:41:47.025 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:41:47.025 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:41:47.025 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:41:47.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:47.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:41:47.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:41:47.026 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:47.027 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:41:47.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:41:47.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins
2026-03-08T22:41:47.029 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:41:47.029 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:47.029 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:47.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927
2026-03-08T22:41:47.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:41:47.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:41:47.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins
2026-03-08T22:41:47.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:41:47.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:47.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:47.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927
2026-03-08T22:41:47.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:41:47.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:41:47.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:41:47.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT
2026-03-08T22:41:47.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:62: TEST_preload_no_warning: run_mon td/test-erasure-code-plugins a --osd_erasure_code_plugins=isa
2026-03-08T22:41:47.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins
2026-03-08T22:41:47.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:41:47.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:41:47.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:41:47.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a
2026-03-08T22:41:47.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins --osd_erasure_code_plugins=isa
2026-03-08T22:41:47.056 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:41:47.056 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:41:47.056 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:41:47.056 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:41:47.056 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:47.056 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:47.056 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:41:47.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_erasure_code_plugins=isa
2026-03-08T22:41:47.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:41:47.086 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:41:47.086 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:41:47.086 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:41:47.086 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:41:47.086 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:41:47.086 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:41:47.086 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:41:47.086 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:41:47.087 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:47.087 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:47.087 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:47.088 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:41:47.088 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:41:47.088 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid
2026-03-08T22:41:47.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:41:47.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:41:47.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:41:47.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:41:47.139 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:41:47.139 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:41:47.139 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:41:47.139 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:41:47.139 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:47.139 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:47.139 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:47.139 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:41:47.140 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:41:47.141 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host
2026-03-08T22:41:47.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:63: TEST_preload_no_warning: run_mgr td/test-erasure-code-plugins x
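In the `run_mon`/`get_config` trace above, `get_asok_path` returns a concrete socket path (`/tmp/ceph-asok.50927/ceph-mon.a.asok`) when called with a daemon name, and a literal `$cluster-$name.asok` template when called with no name, so the daemon itself can expand it via `--admin-socket`. A minimal sketch of that two-mode behavior; the hard-coded `ceph-` cluster prefix matches what the trace shows, but the function bodies here are reconstructions, not the actual helper:

```shell
# Sketch of get_asok_dir/get_asok_path as seen in the trace. CEPH_ASOK_DIR
# as the override variable name is an assumption.
get_asok_dir() {
    if [ -n "${CEPH_ASOK_DIR:-}" ]; then
        echo "$CEPH_ASOK_DIR"
    else
        echo "/tmp/ceph-asok.$$"
    fi
}

get_asok_path() {
    local name=${1:-}
    if [ -n "$name" ]; then
        # Concrete path for one daemon; cluster name "ceph" assumed.
        echo "$(get_asok_dir)/ceph-$name.asok"
    else
        # Literal template: the daemon substitutes $cluster and $name itself.
        echo "$(get_asok_dir)/\$cluster-\$name.asok"
    fi
}
```

The concrete form feeds commands like `ceph --format json daemon <asok> config get fsid | jq -r .fsid` in the log; the template form is passed single-quoted on daemon command lines.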
2026-03-08T22:41:47.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins
2026-03-08T22:41:47.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:41:47.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:41:47.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:41:47.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x
2026-03-08T22:41:47.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:41:47.315 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:41:47.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:41:47.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:41:47.316 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:41:47.316 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:47.316 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:47.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:41:47.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:41:47.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:41:47.340 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: get_asok_path mon.a
2026-03-08T22:41:47.340 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:41:47.340 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:41:47.340 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:47.340 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:47.340 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:47.340 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:41:47.342 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: CEPH_ARGS=
2026-03-08T22:41:47.343 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:64: TEST_preload_no_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush
2026-03-08T22:41:47.388 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:65: TEST_preload_no_warning: run_osd td/test-erasure-code-plugins 0 --osd_erasure_code_plugins=isa
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 '
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0'
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal'
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins'
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:41:47.393 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_erasure_code_plugins=isa
2026-03-08T22:41:47.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0
2026-03-08T22:41:47.395 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:41:47.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=d6eb6860-932d-4ebf-817f-2298069fb44f
2026-03-08T22:41:47.396 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 d6eb6860-932d-4ebf-817f-2298069fb44f
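The `run_osd` trace above builds the daemon's command line by appending flag after flag to a single `ceph_args` string with bash `+=`, then tacks the per-test extras (here `--osd_erasure_code_plugins=isa`) on last so they take precedence. A condensed, illustrative sketch of that pattern; `build_osd_args` is a hypothetical name and only a subset of the flags from the trace is reproduced:

```shell
# Condensed sketch of run_osd's argument assembly (illustrative, not the
# real helper): defaults first, caller-supplied extras appended last.
build_osd_args() {
    local dir=$1 id=$2
    shift 2
    local ceph_args="--osd-failsafe-full-ratio=.99"
    ceph_args+=" --osd-journal-size=100"
    ceph_args+=" --osd-data=$dir/$id"
    ceph_args+=" --osd-journal=$dir/$id/journal"
    ceph_args+=" --run-dir=$dir"
    # Per-test extras go last, e.g. --osd_erasure_code_plugins=isa
    ceph_args+=" $*"
    echo "$ceph_args"
}
```

Note that in the real trace the `$name`-bearing options (`--log-file`, `--pid-file`, `--admin-socket`) are single-quoted so the literal `$name`/`$cluster` survives into the daemon, which expands them itself.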
2026-03-08T22:41:47.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 d6eb6860-932d-4ebf-817f-2298069fb44f'
2026-03-08T22:41:47.396 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:41:47.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAr+61pl7Z5GBAAOY12lJLBYlnCxTyJmIKjqQ==
2026-03-08T22:41:47.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAr+61pl7Z5GBAAOY12lJLBYlnCxTyJmIKjqQ=="}'
2026-03-08T22:41:47.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new d6eb6860-932d-4ebf-817f-2298069fb44f -i td/test-erasure-code-plugins/0/new.json
2026-03-08T22:41:47.523 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:41:47.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json
2026-03-08T22:41:47.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=isa --mkfs --key AQAr+61pl7Z5GBAAOY12lJLBYlnCxTyJmIKjqQ== --osd-uuid d6eb6860-932d-4ebf-817f-2298069fb44f
2026-03-08T22:41:47.554 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:47.556+0000 7f5cf9613780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:47.555 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:47.558+0000 7f5cf9613780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:47.556 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:47.559+0000 7f5cf9613780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:47.556 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:47.559+0000 7f5cf9613780 -1 bdev(0x555a8a07c800 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:41:47.556 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:47.559+0000 7f5cf9613780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid
2026-03-08T22:41:49.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring
2026-03-08T22:41:49.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:41:49.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:41:49.724 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository
2026-03-08T22:41:49.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:41:50.022 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:41:50.022 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0
2026-03-08T22:41:50.022 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=isa
2026-03-08T22:41:50.022 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:41:50.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:41:50.025 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:41:50.041 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:50.043+0000 7f5bee221780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:50.042 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:50.045+0000 7f5bee221780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:50.044 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:50.046+0000 7f5bee221780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:50.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:41:50.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:41:50.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:41:50.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:41:50.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:41:50.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:50.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:41:50.246 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:41:50.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:50.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:50.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:51.108 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:51.111+0000 7f5bee221780 -1 Falling back to public interface
2026-03-08T22:41:51.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:51.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:51.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:41:51.469 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:41:51.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:51.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:51.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:51.977 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:51.980+0000 7f5bee221780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:41:52.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:52.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:52.692 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:41:52.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:41:52.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:52.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:52.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:53.925 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:41:53.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:53.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:53.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:41:53.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:53.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:54.228 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1069610145,v1:127.0.0.1:6803/1069610145] [v2:127.0.0.1:6804/1069610145,v1:127.0.0.1:6805/1069610145] exists,up d6eb6860-932d-4ebf-817f-2298069fb44f
2026-03-08T22:41:54.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:41:54.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:41:54.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:41:54.228 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: get_asok_path osd.0
2026-03-08T22:41:54.228 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:41:54.228 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:41:54.228 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:54.229 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:54.229 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:54.229 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-osd.0.asok
2026-03-08T22:41:54.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: CEPH_ARGS=
2026-03-08T22:41:54.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:66: TEST_preload_no_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-osd.0.asok log flush
2026-03-08T22:41:54.282 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:41:54.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:67: TEST_preload_no_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin' td/test-erasure-code-plugins/mon.a.log
2026-03-08T22:41:54.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:68: TEST_preload_no_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin' td/test-erasure-code-plugins/osd.0.log
2026-03-08T22:41:54.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:69: TEST_preload_no_warning: teardown td/test-erasure-code-plugins
2026-03-08T22:41:54.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins
2026-03-08T22:41:54.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:41:54.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL
2026-03-08T22:41:54.294 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:41:54.294 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:41:54.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:41:54.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:41:54.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:41:54.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:41:54.414 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:41:54.414 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:41:54.414 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:41:54.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:41:54.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:41:54.415 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:41:54.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:54.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:41:54.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:41:54.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:54.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:41:54.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:41:54.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins
2026-03-08T22:41:54.424 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:41:54.424 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:54.424 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:54.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:72: TEST_preload_no_warning: return 0
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:35: run: for func in $funcs
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:36: run: TEST_preload_no_warning_default td/test-erasure-code-plugins
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:76: TEST_preload_no_warning_default: local dir=td/test-erasure-code-plugins
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:78: TEST_preload_no_warning_default: setup td/test-erasure-code-plugins
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:41:54.426 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:41:54.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:41:54.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:41:54.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:41:54.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:41:54.428 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:41:54.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:41:54.429 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
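[editor's note] The wait_for_osd xtrace above (ceph-helpers.sh:982-985) is a plain poll-with-timeout: run "ceph osd dump | grep 'osd.0 up'" once a second, give up after 300 attempts. A minimal generic sketch of that pattern; the function name `wait_for` and its signature are hypothetical, not the helper's real interface:

```shell
# Poll a command once per second until it succeeds or 'tries' attempts
# are exhausted. Mirrors the loop shape seen in the trace.
wait_for() {
    local tries=$1; shift
    local i=0
    while [ "$i" -lt "$tries" ]; do
        if "$@"; then        # probe succeeded: daemon is up
            return 0
        fi
        sleep 1              # same 1-second backoff as the trace
        i=$((i + 1))
    done
    return 1                 # timed out
}
# hypothetical usage, modeled on the traced probe:
#   wait_for 300 sh -c "ceph osd dump | grep -q 'osd.0 up'"
```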
2026-03-08T22:41:54.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:41:54.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:41:54.430 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:41:54.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:54.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:41:54.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:41:54.431 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:54.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:41:54.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:41:54.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins
2026-03-08T22:41:54.434 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:41:54.434 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:54.434 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:54.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927
2026-03-08T22:41:54.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:41:54.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:41:54.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins
2026-03-08T22:41:54.436 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:41:54.436 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:54.436 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:54.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927
2026-03-08T22:41:54.437 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:41:54.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:41:54.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:41:54.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT
2026-03-08T22:41:54.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:79: TEST_preload_no_warning_default: run_mon td/test-erasure-code-plugins a
2026-03-08T22:41:54.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins
2026-03-08T22:41:54.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:41:54.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:41:54.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:41:54.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a
2026-03-08T22:41:54.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins
2026-03-08T22:41:54.461 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:41:54.461 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:41:54.461 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:41:54.461 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:41:54.461 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:54.462 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:54.462 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:41:54.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:41:54.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:41:54.492 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:41:54.492 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:41:54.492 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:41:54.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:41:54.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:41:54.495 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:41:54.495 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:41:54.495 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:41:54.496 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:54.496 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:54.496 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:54.496 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:41:54.497 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:41:54.497 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:54.553 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:41:54.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:41:54.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host
2026-03-08T22:41:54.607 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:80: TEST_preload_no_warning_default: get_asok_path mon.a
2026-03-08T22:41:54.607 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:41:54.607 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:41:54.607 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:54.607 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:54.607 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:41:54.607 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:41:54.607 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:80: TEST_preload_no_warning_default: CEPH_ARGS=
2026-03-08T22:41:54.607 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:80: TEST_preload_no_warning_default: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush
2026-03-08T22:41:54.650 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:41:54.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:81: TEST_preload_no_warning_default: run_mgr td/test-erasure-code-plugins x
2026-03-08T22:41:54.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins
2026-03-08T22:41:54.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:41:54.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:41:54.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:41:54.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x
2026-03-08T22:41:54.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
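[editor's note] The repeated get_asok_dir/get_asok_path excursions in the trace implement a small lookup: use an override directory when a variable is set, otherwise a per-run /tmp directory; append the daemon name when one is given, and emit the literal '$cluster-$name.asok' template when it is not. A hedged re-creation of that behavior (the real helpers live in qa/standalone/ceph-helpers.sh; the `CEPH_ASOK_DIR` variable name and the PID-based suffix here are assumptions, not the helper's actual mechanics):

```shell
# Hypothetical sketch of the asok-path helpers traced above.
get_asok_dir() {
    if [ -n "$CEPH_ASOK_DIR" ]; then
        echo "$CEPH_ASOK_DIR"        # explicit override, if set
    else
        echo "/tmp/ceph-asok.$$"     # per-run dir; PID suffix is an assumption
    fi
}

get_asok_path() {
    local name=$1
    if [ -n "$name" ]; then
        # concrete daemon, e.g. mon.a -> .../ceph-mon.a.asok
        echo "$(get_asok_dir)/ceph-$name.asok"
    else
        # literal template handed to daemons on their command line
        echo "$(get_asok_dir)/\$cluster-\$name.asok"
    fi
}
```

This matches what the trace shows: with `mon.a` the path resolves to /tmp/ceph-asok.50927/ceph-mon.a.asok, while the no-argument form yields the quoted '$cluster-$name.asok' template seen in the run_mon and run_mgr invocations.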
2026-03-08T22:41:54.771 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:41:54.771 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:54.771 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:54.771 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:54.771 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.771 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:54.772 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:54.772 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:41:54.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 
2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:82: TEST_preload_no_warning_default: run_osd td/test-erasure-code-plugins 0 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 ' 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:41:54.794 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0' 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal' 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:41:54.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins' 2026-03-08T22:41:54.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:41:54.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:54.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:54.795 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:54.795 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.795 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:41:54.795 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log' 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid' 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:41:54.796 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:41:54.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0 2026-03-08T22:41:54.797 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:41:54.798 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=eef8f093-de61-4fe8-8734-2df691d5c0a7 2026-03-08T22:41:54.798 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 eef8f093-de61-4fe8-8734-2df691d5c0a7' 2026-03-08T22:41:54.798 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 eef8f093-de61-4fe8-8734-2df691d5c0a7 2026-03-08T22:41:54.798 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:41:54.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAy+61pCVJnMBAAGXnDYA3F3fk5rW8n3mTUeA== 2026-03-08T22:41:54.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAy+61pCVJnMBAAGXnDYA3F3fk5rW8n3mTUeA=="}' 2026-03-08T22:41:54.809 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new eef8f093-de61-4fe8-8734-2df691d5c0a7 -i td/test-erasure-code-plugins/0/new.json 2026-03-08T22:41:54.938 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:41:54.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json 2026-03-08T22:41:54.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAy+61pCVJnMBAAGXnDYA3F3fk5rW8n3mTUeA== --osd-uuid eef8f093-de61-4fe8-8734-2df691d5c0a7 2026-03-08T22:41:54.971 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:54.973+0000 7f71b421e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:54.974 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:54.977+0000 7f71b421e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:54.978+0000 7f71b421e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:54.978+0000 7f71b421e780 -1 bdev(0x55c44f64bc00 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:54.978+0000 7f71b421e780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid 2026-03-08T22:41:57.072 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring 2026-03-08T22:41:57.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:41:57.073 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:41:57.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:41:57.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:41:57.200 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:41:57.200 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:41:57.200 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= 
--run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:41:57.200 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:41:57.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:41:57.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:41:57.216 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:57.219+0000 7f982843f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:57.218 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:57.221+0000 7f982843f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:57.219 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:57.222+0000 7f982843f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:57.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:41:57.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:41:57.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:41:57.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:41:57.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:41:57.423 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:57.423 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:41:57.423 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:41:57.423 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:57.423 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:57.653 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:58.558 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:58.560+0000 7f982843f780 -1 Falling back to public interface 2026-03-08T22:41:58.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:41:58.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:58.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:41:58.654 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:41:58.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:58.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:58.868 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:59.392 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:41:59.395+0000 7f982843f780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:41:59.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:59.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:59.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:41:59.870 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:41:59.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:59.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:00.138 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:01.140 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:42:01.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:01.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:01.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:42:01.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:01.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:01.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:02.394 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T22:42:02.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:02.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:02.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:42:02.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:02.395 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:02.618 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/964835539,v1:127.0.0.1:6803/964835539] [v2:127.0.0.1:6804/964835539,v1:127.0.0.1:6805/964835539] exists,up eef8f093-de61-4fe8-8734-2df691d5c0a7 2026-03-08T22:42:02.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:42:02.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:42:02.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:42:02.618 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:83: TEST_preload_no_warning_default: get_asok_path osd.0 2026-03-08T22:42:02.618 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:02.618 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:02.619 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:02.619 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:02.619 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 
2026-03-08T22:42:02.619 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-osd.0.asok 2026-03-08T22:42:02.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:83: TEST_preload_no_warning_default: CEPH_ARGS= 2026-03-08T22:42:02.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:83: TEST_preload_no_warning_default: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-osd.0.asok log flush 2026-03-08T22:42:02.669 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:42:02.673 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:84: TEST_preload_no_warning_default: grep 'WARNING: osd_erasure_code_plugins' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:42:02.674 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:85: TEST_preload_no_warning_default: grep 'WARNING: osd_erasure_code_plugins' td/test-erasure-code-plugins/osd.0.log 2026-03-08T22:42:02.675 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:86: TEST_preload_no_warning_default: teardown td/test-erasure-code-plugins 2026-03-08T22:42:02.675 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:02.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:02.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: 
teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:42:02.676 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:02.676 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:02.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:02.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:02.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:02.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:02.792 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:02.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:02.793 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:42:02.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:02.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:02.794 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:02.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:02.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:02.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:02.796 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:02.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:02.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:02.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:42:02.802 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:02.802 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:02.802 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:02.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927
2026-03-08T22:42:02.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:02.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:88: TEST_preload_no_warning_default: return 0
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:35: run: for func in $funcs
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:36: run: TEST_preload_warning td/test-erasure-code-plugins
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:41: TEST_preload_warning: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:43: TEST_preload_warning: for plugin in ${legacy_jerasure_plugins[*]} ${legacy_shec_plugins[*]}
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:44: TEST_preload_warning: setup td/test-erasure-code-plugins
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:02.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:02.806 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:02.806 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:02.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:02.807 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:02.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:02.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:02.808 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:02.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:02.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:02.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:02.809 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:02.810 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:02.811 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:02.811 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins
2026-03-08T22:42:02.812 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:02.812 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:02.812 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:02.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927
2026-03-08T22:42:02.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:02.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:02.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins
2026-03-08T22:42:02.815 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:42:02.815 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:02.815 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:02.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927
2026-03-08T22:42:02.816 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:42:02.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:42:02.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:42:02.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT
2026-03-08T22:42:02.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:45: TEST_preload_warning: run_mon td/test-erasure-code-plugins a --osd_erasure_code_plugins=jerasure_generic
2026-03-08T22:42:02.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:02.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:42:02.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:42:02.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:42:02.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a
2026-03-08T22:42:02.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins --osd_erasure_code_plugins=jerasure_generic
2026-03-08T22:42:02.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:42:02.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:02.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:02.845 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:02.845 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:02.846 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:02.846 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:02.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_erasure_code_plugins=jerasure_generic
2026-03-08T22:42:02.876 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:42:02.876 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:42:02.876 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:02.876 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:02.876 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:42:02.877 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:42:02.877 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:02.877 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:02.877 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:02.877 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:02.877 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:02.877 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:02.878 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:42:02.878 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:02.878 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:02.932 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:42:02.933 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:02.933 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host
2026-03-08T22:42:02.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:46: TEST_preload_warning: run_mgr td/test-erasure-code-plugins x
2026-03-08T22:42:02.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:02.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:42:02.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:42:02.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:42:02.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x
2026-03-08T22:42:02.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:42:03.095 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:42:03.095 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:03.095 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:03.096 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:03.096 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:03.096 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:03.096 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:03.096 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:03.097 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:03.120 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: get_asok_path mon.a
2026-03-08T22:42:03.120 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:03.120 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:03.120 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:03.120 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:03.120 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:03.120 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:42:03.120 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: CEPH_ARGS=
2026-03-08T22:42:03.120 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush
2026-03-08T22:42:03.168 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:42:03.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:48: TEST_preload_warning: run_osd td/test-erasure-code-plugins 0 --osd_erasure_code_plugins=jerasure_generic
2026-03-08T22:42:03.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:03.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:42:03.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 '
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0'
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal'
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins'
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:03.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log'
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid'
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_erasure_code_plugins=jerasure_generic
2026-03-08T22:42:03.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0
2026-03-08T22:42:03.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:42:03.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=186df1f9-8e3e-4415-9ced-327c25788b02
2026-03-08T22:42:03.178 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 186df1f9-8e3e-4415-9ced-327c25788b02
2026-03-08T22:42:03.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 186df1f9-8e3e-4415-9ced-327c25788b02'
2026-03-08T22:42:03.178 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:42:03.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQA7+61pu/iKCxAA7WqfOzOaM2KN+M2BROQq1g==
2026-03-08T22:42:03.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQA7+61pu/iKCxAA7WqfOzOaM2KN+M2BROQq1g=="}'
2026-03-08T22:42:03.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 186df1f9-8e3e-4415-9ced-327c25788b02 -i td/test-erasure-code-plugins/0/new.json
2026-03-08T22:42:03.311 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:42:03.320 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json
2026-03-08T22:42:03.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=jerasure_generic --mkfs --key AQA7+61pu/iKCxAA7WqfOzOaM2KN+M2BROQq1g== --osd-uuid 186df1f9-8e3e-4415-9ced-327c25788b02
2026-03-08T22:42:03.342 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:03.344+0000 7fe6a4a1c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:03.344 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:03.346+0000 7fe6a4a1c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:03.345 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:03.348+0000 7fe6a4a1c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:03.345 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:03.348+0000 7fe6a4a1c780 -1 bdev(0x5593c2685c00 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:42:03.345 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:03.348+0000 7fe6a4a1c780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid
2026-03-08T22:42:05.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring
2026-03-08T22:42:05.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:42:05.448 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository
2026-03-08T22:42:05.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:42:05.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:42:05.750 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0
2026-03-08T22:42:05.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:42:05.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=jerasure_generic
2026-03-08T22:42:05.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:42:05.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:42:05.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:42:05.768 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:05.770+0000 7fbdc86b1780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:05.775 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:05.778+0000 7fbdc86b1780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:05.777 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:05.779+0000 7fbdc86b1780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:05.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:42:05.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:42:05.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:42:05.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:42:05.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:42:05.976 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:42:05.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:05.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:42:05.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:05.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:06.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:06.328 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:06.330+0000 7fbdc86b1780 -1 Falling back to public interface
2026-03-08T22:42:07.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:07.194 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:07.194 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:42:07.194 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:42:07.194 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:07.194 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:07.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:07.950 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:07.953+0000 7fbdc86b1780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:42:08.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:08.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:08.433 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:42:08.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:42:08.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:08.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:08.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:09.666 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:42:09.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:09.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:09.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:42:09.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:09.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:09.797 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:09.800+0000 7fbdc3e50640 -1 osd.0 0 waiting for initial osdmap
2026-03-08T22:42:09.897 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/4151230881,v1:127.0.0.1:6803/4151230881] [v2:127.0.0.1:6804/4151230881,v1:127.0.0.1:6805/4151230881] exists,up 186df1f9-8e3e-4415-9ced-327c25788b02
2026-03-08T22:42:09.897 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: 
status=0 2026-03-08T22:42:09.897 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:42:09.897 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:42:09.897 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: get_asok_path osd.0 2026-03-08T22:42:09.897 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:09.897 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:09.897 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:09.897 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:09.898 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:09.898 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-osd.0.asok 2026-03-08T22:42:09.898 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: CEPH_ARGS= 2026-03-08T22:42:09.898 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: ceph --admin-daemon 
/tmp/ceph-asok.50927/ceph-osd.0.asok log flush 2026-03-08T22:42:09.948 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:42:09.953 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:50: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin jerasure_generic' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:42:09.954 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:02.868+0000 7f99d700cd80 0 WARNING: osd_erasure_code_plugins contains plugin jerasure_generic that is now deprecated. Please modify the value for osd_erasure_code_plugins to use jerasure instead. 2026-03-08T22:42:09.955 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:51: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin jerasure_generic' td/test-erasure-code-plugins/osd.0.log 2026-03-08T22:42:09.955 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:06.331+0000 7fbdc86b1780 0 WARNING: osd_erasure_code_plugins contains plugin jerasure_generic that is now deprecated. Please modify the value for osd_erasure_code_plugins to use jerasure instead. 
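The trace above exercises two helper patterns: `wait_for_osd` (ceph-helpers.sh:978-991) polls `ceph osd dump` once per second for up to 300 tries until `osd.0 up` appears, and `TEST_preload_warning` then flushes the daemon log over the admin socket and greps it for the `osd_erasure_code_plugins` deprecation warning. A minimal standalone sketch of the polling loop, with the `ceph osd dump` call replaced by an injectable `osd_dump` stand-in so the loop can run without a live cluster (the real helper invokes the CLI directly):

```shell
# Sketch of the wait_for_osd retry loop traced above (ceph-helpers.sh:978-991).
# osd_dump is a hypothetical stand-in for `ceph osd dump`; max_tries mirrors
# the 300 x 1s bound visible in the trace.
wait_for_osd() {
    local state=$1
    local id=$2
    local max_tries=${3:-300}
    local i
    for ((i = 0; i < max_tries; i++)); do
        # e.g. matches "osd.0 up in weight 1 up_from 5 ..." in the dump output
        if osd_dump | grep -q "osd\.${id} ${state}"; then
            return 0
        fi
        sleep 1
    done
    return 1    # daemon never reached the requested state within the bound
}
```

The same flush-then-grep assertion style appears twice in the trace: once against `mon.a.log` and once against `osd.0.log`, each expected to contain the "contains plugin jerasure_generic that is now deprecated" line.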
2026-03-08T22:42:09.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:52: TEST_preload_warning: teardown td/test-erasure-code-plugins 2026-03-08T22:42:09.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:09.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:09.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:42:09.956 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:09.956 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:09.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:09.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:09.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:10.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:10.076 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:10.076 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:10.077 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:42:10.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:10.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:10.078 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:10.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:10.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:10.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:10.079 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:10.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:10.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:10.081 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:42:10.087 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:10.087 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:10.087 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:10.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:42:10.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:10.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:10.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:43: TEST_preload_warning: for plugin in ${legacy_jerasure_plugins[*]} ${legacy_shec_plugins[*]} 2026-03-08T22:42:10.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:44: TEST_preload_warning: setup td/test-erasure-code-plugins 2026-03-08T22:42:10.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:10.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins 
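The teardown/setup cycle just traced follows a fixed shape: kill any remaining daemons, skip core-dump collection unless cores were produced, remove the test directory and the admin-socket directory, then recreate both and re-arm a cleanup trap before the next plugin iteration. A condensed sketch of that shape (directory names taken from this log; the real `setup`/`teardown` in ceph-helpers.sh carry extra branches, e.g. the btrfs and `kernel.core_pattern` checks, and `kill_daemons` is stubbed here):

```shell
# Condensed shape of the setup/teardown cycle from ceph-helpers.sh as traced.
# kill_daemons is a placeholder; the real helper signals every daemon pidfile.
ASOK_DIR=/tmp/ceph-asok.50927    # value seen in this log; normally per-PID

teardown() {
    local dir=$1
    kill_daemons "$dir" KILL     # stop anything still running
    rm -fr "$dir"                # test data: journals, logs, keyrings
    rm -rf "$ASOK_DIR"           # admin sockets
}

setup() {
    local dir=$1
    teardown "$dir"              # always start from a clean slate
    mkdir -p "$dir" "$ASOK_DIR"
    trap "teardown $dir" TERM HUP INT   # clean up if the test is interrupted
}
```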
2026-03-08T22:42:10.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:10.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:10.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:42:10.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:10.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:10.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:10.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:10.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:10.090 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:10.091 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:10.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:10.092 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: 
teardown: stat -f -c %T . 2026-03-08T22:42:10.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:10.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:10.093 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:10.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:10.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:10.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:10.094 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:10.095 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:10.095 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:10.095 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:42:10.096 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 
2026-03-08T22:42:10.096 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:10.096 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:10.097 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:42:10.097 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:10.097 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:10.097 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins 2026-03-08T22:42:10.098 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:42:10.098 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:10.098 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:10.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927 2026-03-08T22:42:10.100 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:42:10.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 
4096 -le 1024 ']' 2026-03-08T22:42:10.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:42:10.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT 2026-03-08T22:42:10.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:45: TEST_preload_warning: run_mon td/test-erasure-code-plugins a --osd_erasure_code_plugins=jerasure_sse3 2026-03-08T22:42:10.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:10.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:42:10.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:42:10.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:42:10.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a 2026-03-08T22:42:10.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins --osd_erasure_code_plugins=jerasure_sse3 2026-03-08T22:42:10.128 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:42:10.128 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:10.128 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:10.128 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:10.128 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:10.128 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:10.128 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:10.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_erasure_code_plugins=jerasure_sse3 2026-03-08T22:42:10.161 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:42:10.162 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:42:10.162 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:42:10.162 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:42:10.163 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:42:10.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:42:10.165 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:42:10.165 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:10.165 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:10.165 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:10.165 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:10.165 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 
2026-03-08T22:42:10.166 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:42:10.166 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:42:10.166 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid 2026-03-08T22:42:10.223 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:42:10.223 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:42:10.223 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:42:10.223 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:42:10.224 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:42:10.224 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:42:10.224 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:10.224 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:10.224 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:10.224 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:10.224 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:10.224 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:42:10.225 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:42:10.225 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host 2026-03-08T22:42:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:46: TEST_preload_warning: run_mgr td/test-erasure-code-plugins x 2026-03-08T22:42:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:42:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:42:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:42:10.283 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x 2026-03-08T22:42:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:42:10.400 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:42:10.400 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:10.400 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:10.400 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:10.400 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:10.400 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:10.400 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:10.401 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:42:10.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 
--debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:42:10.424 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: get_asok_path mon.a 2026-03-08T22:42:10.426 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:10.426 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:10.426 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:10.426 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:10.426 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:10.426 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:42:10.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: CEPH_ARGS= 2026-03-08T22:42:10.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: ceph 
--admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:42:10.470 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:48: TEST_preload_warning: run_osd td/test-erasure-code-plugins 0 --osd_erasure_code_plugins=jerasure_sse3 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 ' 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: 
run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0' 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal' 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins' 2026-03-08T22:42:10.476 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo 
/tmp/ceph-asok.50927 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 
2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_erasure_code_plugins=jerasure_sse3 2026-03-08T22:42:10.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0 2026-03-08T22:42:10.478 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:42:10.479 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=2d31f5ad-874f-494d-986f-9d04e5821594 2026-03-08T22:42:10.479 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 2d31f5ad-874f-494d-986f-9d04e5821594 2026-03-08T22:42:10.479 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 2d31f5ad-874f-494d-986f-9d04e5821594' 2026-03-08T22:42:10.479 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:42:10.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBC+61pknaEHRAAmL1DWKbPp3h8MkbB9DES7w== 2026-03-08T22:42:10.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBC+61pknaEHRAAmL1DWKbPp3h8MkbB9DES7w=="}' 
2026-03-08T22:42:10.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 2d31f5ad-874f-494d-986f-9d04e5821594 -i td/test-erasure-code-plugins/0/new.json 2026-03-08T22:42:10.630 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:42:10.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json 2026-03-08T22:42:10.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=jerasure_sse3 --mkfs --key AQBC+61pknaEHRAAmL1DWKbPp3h8MkbB9DES7w== --osd-uuid 2d31f5ad-874f-494d-986f-9d04e5821594 2026-03-08T22:42:10.661 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:10.663+0000 7f50d6c3f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:10.662 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:10.665+0000 7f50d6c3f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:42:10.662 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:10.665+0000 7f50d6c3f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:10.663 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:10.666+0000 7f50d6c3f780 -1 bdev(0x5582e0e2fc00 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:42:10.663 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:10.666+0000 7f50d6c3f780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid 2026-03-08T22:42:13.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring 2026-03-08T22:42:13.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:42:13.042 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:42:13.042 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:42:13.042 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:42:13.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:42:13.326 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:42:13.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=jerasure_sse3 2026-03-08T22:42:13.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:42:13.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:42:13.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:42:13.346 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:13.348+0000 7f3752319780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:13.353 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:13.356+0000 7f3752319780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:13.354 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:13.357+0000 7f3752319780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:42:13.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:42:13.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:42:13.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:42:13.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:42:13.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:42:13.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:13.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:42:13.584 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:42:13.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:13.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:13.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:14.415 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:14.418+0000 7f3752319780 -1 Falling back to public interface 2026-03-08T22:42:14.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:42:14.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:14.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:42:14.804 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:42:14.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:14.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:15.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:15.277 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:15.280+0000 7f3752319780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:42:16.188 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:42:16.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:16.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:16.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:42:16.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:16.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:16.439 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:16.730 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:16.732+0000 7f374daba640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:42:17.443 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:42:17.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:17.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:17.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:42:17.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:17.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:17.655 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3978398632,v1:127.0.0.1:6803/3978398632] [v2:127.0.0.1:6804/3978398632,v1:127.0.0.1:6805/3978398632] exists,up 2d31f5ad-874f-494d-986f-9d04e5821594 2026-03-08T22:42:17.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:42:17.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:42:17.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:42:17.656 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: get_asok_path osd.0 2026-03-08T22:42:17.656 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:17.656 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:17.656 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:17.656 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:17.656 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:17.656 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-osd.0.asok 2026-03-08T22:42:17.657 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: CEPH_ARGS= 2026-03-08T22:42:17.657 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-osd.0.asok log flush 2026-03-08T22:42:17.710 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:42:17.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:50: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin jerasure_sse3' 
td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:42:17.717 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:10.151+0000 7fa07b61cd80 0 WARNING: osd_erasure_code_plugins contains plugin jerasure_sse3 that is now deprecated. Please modify the value for osd_erasure_code_plugins to use jerasure instead. 2026-03-08T22:42:17.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:51: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin jerasure_sse3' td/test-erasure-code-plugins/osd.0.log 2026-03-08T22:42:17.718 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:14.419+0000 7f3752319780 0 WARNING: osd_erasure_code_plugins contains plugin jerasure_sse3 that is now deprecated. Please modify the value for osd_erasure_code_plugins to use jerasure instead. 2026-03-08T22:42:17.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:52: TEST_preload_warning: teardown td/test-erasure-code-plugins 2026-03-08T22:42:17.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:17.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:17.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:42:17.719 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:17.719 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 
2026-03-08T22:42:17.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:17.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:17.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:17.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:17.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:17.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:17.830 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:42:17.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:17.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:17.831 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:17.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:17.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:17.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:17.832 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:17.833 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:17.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:17.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:42:17.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:17.842 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:17.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:17.842 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:43: TEST_preload_warning: for plugin in ${legacy_jerasure_plugins[*]} ${legacy_shec_plugins[*]} 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:44: TEST_preload_warning: setup td/test-erasure-code-plugins 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local 
dumplogs= 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:17.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:17.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:17.847 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:17.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:17.848 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:42:17.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:17.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:17.849 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:17.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:17.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:17.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:17.850 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:17.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:17.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:17.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:42:17.853 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:17.853 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:17.853 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:17.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:42:17.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:17.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:17.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins 2026-03-08T22:42:17.855 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:42:17.855 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:17.855 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:17.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927 2026-03-08T22:42:17.857 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:42:17.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:42:17.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:42:17.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT 2026-03-08T22:42:17.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:45: TEST_preload_warning: run_mon td/test-erasure-code-plugins a --osd_erasure_code_plugins=jerasure_sse4 2026-03-08T22:42:17.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:17.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:42:17.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:42:17.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:42:17.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a 2026-03-08T22:42:17.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins --osd_erasure_code_plugins=jerasure_sse4 2026-03-08T22:42:17.886 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:42:17.886 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:17.886 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:17.886 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:17.886 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:17.886 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:17.886 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:17.887 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_erasure_code_plugins=jerasure_sse4 2026-03-08T22:42:17.917 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:42:17.920 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:42:17.920 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:42:17.920 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:42:17.920 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:42:17.920 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:42:17.920 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:42:17.920 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:17.921 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:17.921 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:17.921 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:17.921 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 
2026-03-08T22:42:17.923 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:42:17.923 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:42:17.923 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid 2026-03-08T22:42:17.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:42:17.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:42:17.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:42:17.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:42:17.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:42:17.978 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:42:17.978 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:17.978 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:17.978 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:17.978 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:17.978 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:17.979 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:42:17.979 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:42:17.979 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host 2026-03-08T22:42:18.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:46: TEST_preload_warning: run_mgr td/test-erasure-code-plugins x 2026-03-08T22:42:18.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:18.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:42:18.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:42:18.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:42:18.032 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x 2026-03-08T22:42:18.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:42:18.152 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:42:18.152 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:18.152 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:18.153 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:18.153 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:18.153 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:18.153 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:18.153 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:42:18.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 
--debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:42:18.178 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: get_asok_path mon.a 2026-03-08T22:42:18.178 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:18.178 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:18.178 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:18.178 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:18.178 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:18.178 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:42:18.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: CEPH_ARGS= 2026-03-08T22:42:18.179 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: ceph 
--admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:42:18.227 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:42:18.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:48: TEST_preload_warning: run_osd td/test-erasure-code-plugins 0 --osd_erasure_code_plugins=jerasure_sse4 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 ' 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: 
run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0' 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal' 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins' 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo 
/tmp/ceph-asok.50927 2026-03-08T22:42:18.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:18.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:18.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:42:18.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:42:18.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:42:18.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log' 2026-03-08T22:42:18.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid' 2026-03-08T22:42:18.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:42:18.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:42:18.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 
2026-03-08T22:42:18.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:42:18.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:42:18.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_erasure_code_plugins=jerasure_sse4 2026-03-08T22:42:18.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0 2026-03-08T22:42:18.238 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:42:18.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=cd3a3ab9-9d79-4745-b049-4fd9a2774493 2026-03-08T22:42:18.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 cd3a3ab9-9d79-4745-b049-4fd9a2774493' 2026-03-08T22:42:18.239 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 cd3a3ab9-9d79-4745-b049-4fd9a2774493 2026-03-08T22:42:18.239 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:42:18.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBK+61pYDQrDxAAjRC/geLIbsst40GVmwWr9g== 2026-03-08T22:42:18.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBK+61pYDQrDxAAjRC/geLIbsst40GVmwWr9g=="}' 
2026-03-08T22:42:18.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new cd3a3ab9-9d79-4745-b049-4fd9a2774493 -i td/test-erasure-code-plugins/0/new.json 2026-03-08T22:42:18.400 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:42:18.414 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json 2026-03-08T22:42:18.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=jerasure_sse4 --mkfs --key AQBK+61pYDQrDxAAjRC/geLIbsst40GVmwWr9g== --osd-uuid cd3a3ab9-9d79-4745-b049-4fd9a2774493 2026-03-08T22:42:18.450 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:18.452+0000 7ff36e704780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:18.453 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:18.456+0000 7ff36e704780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:42:18.454 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:18.457+0000 7ff36e704780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:18.455 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:18.458+0000 7ff36e704780 -1 bdev(0x55d53a022800 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:42:18.455 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:18.458+0000 7ff36e704780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid 2026-03-08T22:42:20.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring 2026-03-08T22:42:20.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:42:20.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:42:20.873 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:42:20.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:42:21.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:42:21.314 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:42:21.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=jerasure_sse4 2026-03-08T22:42:21.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:42:21.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:42:21.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:42:21.333 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:21.335+0000 7f06523b4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:21.337 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:21.340+0000 7f06523b4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:21.342 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:21.341+0000 7f06523b4780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:42:21.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:42:21.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:42:21.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:42:21.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:42:21.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:42:21.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:21.559 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:42:21.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:42:21.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:21.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:21.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:22.167 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:22.169+0000 7f06523b4780 -1 Falling back to public interface 2026-03-08T22:42:22.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:42:22.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:22.796 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:42:22.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:42:22.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:22.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:23.013 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:23.016+0000 7f06523b4780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:42:23.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:24.040 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:42:24.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:24.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:24.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:42:24.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:24.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:24.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:25.300 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:42:25.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:25.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:25.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:42:25.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:25.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:25.536 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1358872689,v1:127.0.0.1:6803/1358872689] [v2:127.0.0.1:6804/1358872689,v1:127.0.0.1:6805/1358872689] exists,up cd3a3ab9-9d79-4745-b049-4fd9a2774493
2026-03-08T22:42:25.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:42:25.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:42:25.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:42:25.537 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: get_asok_path osd.0
2026-03-08T22:42:25.537 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:25.537 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:25.537 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:25.537 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:25.537 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:25.537 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-osd.0.asok
2026-03-08T22:42:25.538 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: CEPH_ARGS=
2026-03-08T22:42:25.538 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-osd.0.asok log flush
2026-03-08T22:42:25.589 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:42:25.595 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:50: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin jerasure_sse4' td/test-erasure-code-plugins/mon.a.log
2026-03-08T22:42:25.596 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:17.908+0000 7f74be83bd80 0 WARNING: osd_erasure_code_plugins contains plugin jerasure_sse4 that is now deprecated. Please modify the value for osd_erasure_code_plugins to use jerasure instead.
2026-03-08T22:42:25.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:51: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin jerasure_sse4' td/test-erasure-code-plugins/osd.0.log
2026-03-08T22:42:25.598 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:22.171+0000 7f06523b4780 0 WARNING: osd_erasure_code_plugins contains plugin jerasure_sse4 that is now deprecated. Please modify the value for osd_erasure_code_plugins to use jerasure instead.
2026-03-08T22:42:25.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:52: TEST_preload_warning: teardown td/test-erasure-code-plugins
2026-03-08T22:42:25.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:25.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:25.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL
2026-03-08T22:42:25.599 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:25.599 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:25.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
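The TEST_preload_warning assertions traced above first flush the daemon's in-memory log through the admin socket (`ceph --admin-daemon ... log flush`), then grep both the mon.a and osd.0 log files for the deprecation warning. A sketch of just the grep half of that check, generalized over any number of log files (the function name and log paths are illustrative, not the real test's `td/` layout):

```shell
# Succeed only if every given log file contains the jerasure_sse4
# deprecation warning, as lines 50-51 of test-erasure-code-plugins.sh
# check for mon.a.log and osd.0.log above.
expect_preload_warning() {
    local logfile
    for logfile in "$@"; do
        grep -q 'WARNING: osd_erasure_code_plugins contains plugin jerasure_sse4' \
            "$logfile" || return 1
    done
    return 0
}
```

The flush step matters in the real test: without it, a recently emitted warning may still be buffered inside the daemon and absent from the on-disk log when grep runs.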
2026-03-08T22:42:25.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:25.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:25.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:25.714 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:25.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:25.715 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:25.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:25.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:25.716 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:25.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:25.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:25.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:25.717 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:25.718 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:25.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:25.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:43: TEST_preload_warning: for plugin in ${legacy_jerasure_plugins[*]} ${legacy_shec_plugins[*]}
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:44: TEST_preload_warning: setup td/test-erasure-code-plugins
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:25.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL
2026-03-08T22:42:25.726 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:25.726 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:25.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:25.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:25.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:25.728 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:25.728 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:25.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:25.729 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:25.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:25.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:25.730 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:25.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:25.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:25.732 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:25.732 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:25.733 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:25.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:25.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins
2026-03-08T22:42:25.735 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:25.735 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:25.735 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:25.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927
2026-03-08T22:42:25.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:25.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:25.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins
2026-03-08T22:42:25.737 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:42:25.737 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:25.737 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:25.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927
2026-03-08T22:42:25.739 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:42:25.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:42:25.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:42:25.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT
2026-03-08T22:42:25.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:45: TEST_preload_warning: run_mon td/test-erasure-code-plugins a --osd_erasure_code_plugins=shec_generic
2026-03-08T22:42:25.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:25.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:42:25.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:42:25.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:42:25.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a
2026-03-08T22:42:25.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins --osd_erasure_code_plugins=shec_generic
2026-03-08T22:42:25.769 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:42:25.769 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:25.769 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:25.769 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:25.769 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:25.769 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:25.769 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:25.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_erasure_code_plugins=shec_generic
2026-03-08T22:42:25.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:42:25.802 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:42:25.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:25.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:25.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:42:25.804 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:42:25.804 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:25.805 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:25.805 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:25.806 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:25.806 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:25.806 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:25.807 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:42:25.807 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:25.807 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid
2026-03-08T22:42:25.869 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:42:25.869 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:25.869 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:25.869 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:42:25.869 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:42:25.870 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:25.870 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:25.870 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:25.872 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:25.872 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:25.872 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:25.873 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:42:25.873 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:25.873 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host
2026-03-08T22:42:25.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:46: TEST_preload_warning: run_mgr td/test-erasure-code-plugins x
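The get_asok_dir/get_asok_path traces that recur above follow one pattern: get_asok_dir falls back to a per-run `/tmp/ceph-asok.<pid>` directory (50927 in this log) when no override is set, and get_asok_path emits either a concrete `ceph-<name>.asok` socket path or, with no name, the literal `$cluster-$name.asok` template that the daemon itself expands. A sketch of that logic (ASOK_DIR_OVERRIDE is an illustrative variable name, not necessarily the one ceph-helpers.sh checks):

```shell
# Fall back to a per-shell /tmp directory when no override is given,
# mirroring the get_asok_dir trace above ('[' -n '' ']' then the echo).
get_asok_dir() {
    if [ -n "${ASOK_DIR_OVERRIDE-}" ]; then
        echo "$ASOK_DIR_OVERRIDE"
    else
        echo "/tmp/ceph-asok.$$"   # $$ is the test shell's PID, 50927 in the log
    fi
}

# With a daemon name, return its concrete socket; without one, return
# the unexpanded $cluster-$name template used in CEPH_ARGS flags.
get_asok_path() {
    local name=${1-}
    if [ -n "$name" ]; then
        echo "$(get_asok_dir)/ceph-$name.asok"
    else
        echo "$(get_asok_dir)/\$cluster-\$name.asok"
    fi
}
```

The template form explains the quoted `'--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok'` arguments in the daemon command lines above: the shell passes the `$cluster`/`$name` metavariables through untouched for Ceph to expand.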
2026-03-08T22:42:25.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:25.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:42:25.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:42:25.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:42:25.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x
2026-03-08T22:42:25.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:42:26.047 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:42:26.047 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:26.047 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:26.048 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:26.048 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:26.048 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:26.048 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:26.048 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:26.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:26.070 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: get_asok_path mon.a
2026-03-08T22:42:26.071 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:26.071 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:26.071 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:26.071 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:26.071 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:26.071 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:42:26.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: CEPH_ARGS=
2026-03-08T22:42:26.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush
2026-03-08T22:42:26.122 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:48: TEST_preload_warning: run_osd td/test-erasure-code-plugins 0 --osd_erasure_code_plugins=shec_generic
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 '
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0'
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal'
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins'
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:26.127 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_erasure_code_plugins=shec_generic
2026-03-08T22:42:26.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0
2026-03-08T22:42:26.129 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:42:26.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=8094bb2b-0b11-450a-b54c-dad29607a83c
2026-03-08T22:42:26.130
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 8094bb2b-0b11-450a-b54c-dad29607a83c' 2026-03-08T22:42:26.130 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 8094bb2b-0b11-450a-b54c-dad29607a83c 2026-03-08T22:42:26.130 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:42:26.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBS+61pxz67CBAAeGH16gf6WyLJfXNFfJw4Sw== 2026-03-08T22:42:26.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBS+61pxz67CBAAeGH16gf6WyLJfXNFfJw4Sw=="}' 2026-03-08T22:42:26.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 8094bb2b-0b11-450a-b54c-dad29607a83c -i td/test-erasure-code-plugins/0/new.json 2026-03-08T22:42:26.277 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:42:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json 2026-03-08T22:42:26.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 
'--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=shec_generic --mkfs --key AQBS+61pxz67CBAAeGH16gf6WyLJfXNFfJw4Sw== --osd-uuid 8094bb2b-0b11-450a-b54c-dad29607a83c 2026-03-08T22:42:26.316 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:26.317+0000 7f60c1095780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:26.317 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:26.319+0000 7f60c1095780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:26.318 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:26.320+0000 7f60c1095780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:26.318 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:26.321+0000 7f60c1095780 -1 bdev(0x56211e100800 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:42:26.318 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:26.321+0000 7f60c1095780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid 2026-03-08T22:42:28.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring 2026-03-08T22:42:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:42:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:42:28.431 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:42:28.431 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:42:28.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:42:28.727 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:42:28.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=shec_generic 2026-03-08T22:42:28.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:42:28.728 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:42:28.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:42:28.744 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:28.747+0000 7efc44293780 -1 
WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:28.751 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:28.754+0000 7efc44293780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:28.752 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:28.755+0000 7efc44293780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:28.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:42:28.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:42:28.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:42:28.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:42:28.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:42:28.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:28.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:42:28.951 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:42:28.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:28.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:29.168 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:29.818 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:29.820+0000 7efc44293780 -1 Falling back to public interface 2026-03-08T22:42:30.169 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:30.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:30.170 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:42:30.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:42:30.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:30.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:30.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:30.652 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:30.654+0000 7efc44293780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:42:31.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:31.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:31.407 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:42:31.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:42:31.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:31.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:31.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:32.754 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:42:32.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:32.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:32.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:42:32.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:32.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:32.969 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1550064011,v1:127.0.0.1:6803/1550064011] [v2:127.0.0.1:6804/1550064011,v1:127.0.0.1:6805/1550064011] exists,up 8094bb2b-0b11-450a-b54c-dad29607a83c 2026-03-08T22:42:32.969 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:42:32.969 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:42:32.969 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:42:32.970 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: get_asok_path osd.0 2026-03-08T22:42:32.970 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:32.970 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:32.970 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:32.970 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:32.970 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:32.970 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-osd.0.asok 2026-03-08T22:42:32.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: CEPH_ARGS= 2026-03-08T22:42:32.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-osd.0.asok log flush 
2026-03-08T22:42:33.022 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:42:33.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:50: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin shec_generic' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:42:33.030 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:25.793+0000 7fe0e63dad80 0 WARNING: osd_erasure_code_plugins contains plugin shec_generic that is now deprecated. Please modify the value for osd_erasure_code_plugins to use shec instead. 2026-03-08T22:42:33.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:51: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin shec_generic' td/test-erasure-code-plugins/osd.0.log 2026-03-08T22:42:33.031 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:29.822+0000 7efc44293780 0 WARNING: osd_erasure_code_plugins contains plugin shec_generic that is now deprecated. Please modify the value for osd_erasure_code_plugins to use shec instead. 
2026-03-08T22:42:33.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:52: TEST_preload_warning: teardown td/test-erasure-code-plugins 2026-03-08T22:42:33.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:33.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:33.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:42:33.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:33.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:33.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:33.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:33.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:33.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:33.146 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:33.146 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:33.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:42:33.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:33.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:33.148 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:33.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:33.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:33.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:33.149 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:33.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:33.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:33.150 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:42:33.156 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:33.156 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:33.156 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:33.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:42:33.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:33.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:33.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:43: TEST_preload_warning: for plugin in ${legacy_jerasure_plugins[*]} ${legacy_shec_plugins[*]} 2026-03-08T22:42:33.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:44: TEST_preload_warning: setup td/test-erasure-code-plugins 2026-03-08T22:42:33.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:33.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins 
2026-03-08T22:42:33.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:33.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:33.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:42:33.157 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:33.157 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:33.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:33.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:33.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:33.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:33.159 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:33.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:33.160 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: 
teardown: stat -f -c %T . 2026-03-08T22:42:33.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:33.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:33.161 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:33.162 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:33.162 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:33.162 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:33.162 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:33.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:33.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:33.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:42:33.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 
2026-03-08T22:42:33.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:33.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:33.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:42:33.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:33.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:33.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins 2026-03-08T22:42:33.166 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:42:33.166 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:33.166 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:33.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927 2026-03-08T22:42:33.167 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:42:33.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 
4096 -le 1024 ']' 2026-03-08T22:42:33.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:42:33.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT 2026-03-08T22:42:33.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:45: TEST_preload_warning: run_mon td/test-erasure-code-plugins a --osd_erasure_code_plugins=shec_sse3 2026-03-08T22:42:33.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:33.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:42:33.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:42:33.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:42:33.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a 2026-03-08T22:42:33.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins --osd_erasure_code_plugins=shec_sse3 2026-03-08T22:42:33.192 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:42:33.193 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:33.193 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:33.193 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:33.193 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:33.193 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:33.193 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:33.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_erasure_code_plugins=shec_sse3 2026-03-08T22:42:33.223 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:42:33.223 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:42:33.223 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:42:33.223 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:42:33.223 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:42:33.224 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:42:33.224 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:42:33.224 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:33.224 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:33.225 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:33.225 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:33.225 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 
2026-03-08T22:42:33.225 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:42:33.225 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:42:33.225 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid 2026-03-08T22:42:33.277 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:42:33.277 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:42:33.277 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:42:33.277 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:42:33.277 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:42:33.277 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:42:33.277 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:33.277 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:33.278 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:33.278 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:33.278 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:33.278 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:42:33.278 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:42:33.278 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host 2026-03-08T22:42:33.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:46: TEST_preload_warning: run_mgr td/test-erasure-code-plugins x 2026-03-08T22:42:33.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:33.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:42:33.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:42:33.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:42:33.332 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x 2026-03-08T22:42:33.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:42:33.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:42:33.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:33.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:33.447 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:33.448 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:33.448 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:33.448 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:33.448 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:42:33.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 
--debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:42:33.470 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: get_asok_path mon.a 2026-03-08T22:42:33.470 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:33.470 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:33.470 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:33.470 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:33.470 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:33.470 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok 2026-03-08T22:42:33.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: CEPH_ARGS= 2026-03-08T22:42:33.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: ceph 
--admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush 2026-03-08T22:42:33.521 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:48: TEST_preload_warning: run_osd td/test-erasure-code-plugins 0 --osd_erasure_code_plugins=shec_sse3 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 ' 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: 
run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0' 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal' 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:42:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo 
/tmp/ceph-asok.50927 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 
2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_erasure_code_plugins=shec_sse3 2026-03-08T22:42:33.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0 2026-03-08T22:42:33.528 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:42:33.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=a7828c3e-bcf5-43b3-87c1-d88f1f8a2212 2026-03-08T22:42:33.529 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 a7828c3e-bcf5-43b3-87c1-d88f1f8a2212 2026-03-08T22:42:33.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 a7828c3e-bcf5-43b3-87c1-d88f1f8a2212' 2026-03-08T22:42:33.530 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:42:33.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBZ+61pUIhyIBAAsl7NmzhVzqWFGJTCLdPXsQ== 2026-03-08T22:42:33.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBZ+61pUIhyIBAAsl7NmzhVzqWFGJTCLdPXsQ=="}' 
2026-03-08T22:42:33.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new a7828c3e-bcf5-43b3-87c1-d88f1f8a2212 -i td/test-erasure-code-plugins/0/new.json 2026-03-08T22:42:33.667 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:42:33.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json 2026-03-08T22:42:33.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=shec_sse3 --mkfs --key AQBZ+61pUIhyIBAAsl7NmzhVzqWFGJTCLdPXsQ== --osd-uuid a7828c3e-bcf5-43b3-87c1-d88f1f8a2212 2026-03-08T22:42:33.700 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:33.701+0000 7fc912b23780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:33.704 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:33.707+0000 7fc912b23780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:42:33.706 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:33.708+0000 7fc912b23780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:33.706 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:33.708+0000 7fc912b23780 -1 bdev(0x555cd68a3c00 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:42:33.706 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:33.708+0000 7fc912b23780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid 2026-03-08T22:42:35.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring 2026-03-08T22:42:35.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:42:35.909 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:42:35.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:42:35.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:42:36.213 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:42:36.213 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:42:36.213 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=shec_sse3 2026-03-08T22:42:36.213 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:42:36.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:42:36.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:42:36.231 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:36.233+0000 7f461960c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:36.239 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:36.242+0000 7f461960c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:36.240 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:36.243+0000 7f461960c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:42:36.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:42:36.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:42:36.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:42:36.438 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:42:36.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:42:36.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:42:36.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:36.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:42:36.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:36.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:36.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:37.303 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:37.306+0000 7f461960c780 -1 Falling back to public interface 2026-03-08T22:42:37.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:42:37.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:37.668 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:42:37.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:42:37.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:37.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:37.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:38.314 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:38.317+0000 7f461960c780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:42:38.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:38.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:38.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:42:38.890 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:42:38.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:38.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:39.124 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:40.126 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:42:40.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:40.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:40.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:42:40.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:40.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:40.414 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:41.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:41.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:41.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:42:41.416 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T22:42:41.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:41.416 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:41.639 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/572703765,v1:127.0.0.1:6803/572703765] [v2:127.0.0.1:6804/572703765,v1:127.0.0.1:6805/572703765] exists,up a7828c3e-bcf5-43b3-87c1-d88f1f8a2212 2026-03-08T22:42:41.640 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:42:41.640 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:42:41.640 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:42:41.640 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: get_asok_path osd.0 2026-03-08T22:42:41.640 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:41.640 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:41.641 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:41.641 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:41.641 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 
2026-03-08T22:42:41.641 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-osd.0.asok
2026-03-08T22:42:41.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: CEPH_ARGS=
2026-03-08T22:42:41.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-osd.0.asok log flush
2026-03-08T22:42:41.691 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:42:41.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:50: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin shec_sse3' td/test-erasure-code-plugins/mon.a.log
2026-03-08T22:42:41.699 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:33.215+0000 7f58b87d4d80 0 WARNING: osd_erasure_code_plugins contains plugin shec_sse3 that is now deprecated. Please modify the value for osd_erasure_code_plugins to use shec instead.
2026-03-08T22:42:41.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:51: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin shec_sse3' td/test-erasure-code-plugins/osd.0.log
2026-03-08T22:42:41.700 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:37.307+0000 7f461960c780 0 WARNING: osd_erasure_code_plugins contains plugin shec_sse3 that is now deprecated. Please modify the value for osd_erasure_code_plugins to use shec instead.
2026-03-08T22:42:41.700 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:52: TEST_preload_warning: teardown td/test-erasure-code-plugins
2026-03-08T22:42:41.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:41.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:41.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL
2026-03-08T22:42:41.701 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:41.701 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:41.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:41.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:41.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:41.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:41.821 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:41.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:41.822 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:41.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:41.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:41.823 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:41.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:41.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:41.825 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:41.825 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:41.825 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:41.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:41.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins
2026-03-08T22:42:41.831 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:41.831 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:41.831 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:41.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927
2026-03-08T22:42:41.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:41.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:41.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:43: TEST_preload_warning: for plugin in ${legacy_jerasure_plugins[*]} ${legacy_shec_plugins[*]}
2026-03-08T22:42:41.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:44: TEST_preload_warning: setup td/test-erasure-code-plugins
2026-03-08T22:42:41.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:41.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code-plugins
2026-03-08T22:42:41.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:41.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:41.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL
2026-03-08T22:42:41.836 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:41.836 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:41.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:41.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:41.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:41.837 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:41.837 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:41.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:41.838 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:41.839 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:41.839 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:41.839 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:41.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:41.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:41.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:41.840 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:41.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:41.842 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:41.842 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins
2026-03-08T22:42:41.843 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:41.843 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:41.843 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:41.843 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927
2026-03-08T22:42:41.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:41.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:41.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code-plugins
2026-03-08T22:42:41.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:42:41.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:41.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:41.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50927
2026-03-08T22:42:41.846 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:42:41.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:42:41.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:42:41.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code-plugins 1' TERM HUP INT
2026-03-08T22:42:41.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:45: TEST_preload_warning: run_mon td/test-erasure-code-plugins a --osd_erasure_code_plugins=shec_sse4
2026-03-08T22:42:41.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:41.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:42:41.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:42:41.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:42:41.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code-plugins/a
2026-03-08T22:42:41.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code-plugins/a --run-dir=td/test-erasure-code-plugins --osd_erasure_code_plugins=shec_sse4
2026-03-08T22:42:41.873 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:42:41.874 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:41.874 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:41.874 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:41.874 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:41.874 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:41.874 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:41.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code-plugins/a '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code-plugins/log --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_erasure_code_plugins=shec_sse4
2026-03-08T22:42:41.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:42:41.901 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:42:41.901 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:41.901 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:41.901 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:42:41.901 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:42:41.902 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:41.902 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:41.902 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:41.902 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:41.902 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:41.902 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:41.902 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:42:41.903 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:41.903 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get fsid
2026-03-08T22:42:41.957 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:42:41.957 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:41.957 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:41.957 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:42:41.958 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:42:41.958 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:41.958 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:41.958 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:41.958 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:41.958 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:41.958 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:41.959 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:42:41.959 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:41.959 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50927/ceph-mon.a.asok config get mon_host
2026-03-08T22:42:42.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:46: TEST_preload_warning: run_mgr td/test-erasure-code-plugins x
2026-03-08T22:42:42.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:42.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:42:42.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:42:42.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:42:42.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code-plugins/x
2026-03-08T22:42:42.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:42:42.129 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:42:42.129 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:42.129 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:42.130 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:42.130 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:42.130 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:42.130 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:42.130 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:42.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code-plugins/x '--log-file=td/test-erasure-code-plugins/$name.log' '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --run-dir=td/test-erasure-code-plugins '--pid-file=td/test-erasure-code-plugins/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:42.152 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: get_asok_path mon.a
2026-03-08T22:42:42.152 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:42.152 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:42.152 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:42.152 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:42.152 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:42.153 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-mon.a.asok
2026-03-08T22:42:42.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: CEPH_ARGS=
2026-03-08T22:42:42.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:47: TEST_preload_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-mon.a.asok log flush
2026-03-08T22:42:42.196 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:42:42.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:48: TEST_preload_warning: run_osd td/test-erasure-code-plugins 0 --osd_erasure_code_plugins=shec_sse4
2026-03-08T22:42:42.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code-plugins
2026-03-08T22:42:42.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:42:42.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:42:42.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:42:42.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code-plugins/0
2026-03-08T22:42:42.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 '
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code-plugins/0'
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code-plugins/0/journal'
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code-plugins'
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:42.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok'
2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code-plugins/$name.log'
2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code-plugins/$name.pid'
2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_erasure_code_plugins=shec_sse4 2026-03-08T22:42:42.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code-plugins/0 2026-03-08T22:42:42.205 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:42:42.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=7e275e37-e055-49a1-8a6e-e0d310882b91 2026-03-08T22:42:42.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 7e275e37-e055-49a1-8a6e-e0d310882b91' 2026-03-08T22:42:42.205 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 7e275e37-e055-49a1-8a6e-e0d310882b91 2026-03-08T22:42:42.206 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:42:42.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBi+61pYB4dDRAAdDh71u57n2ceFxJrl3uh2A== 2026-03-08T22:42:42.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBi+61pYB4dDRAAdDh71u57n2ceFxJrl3uh2A=="}' 
2026-03-08T22:42:42.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 7e275e37-e055-49a1-8a6e-e0d310882b91 -i td/test-erasure-code-plugins/0/new.json 2026-03-08T22:42:42.340 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:42:42.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code-plugins/0/new.json 2026-03-08T22:42:42.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=shec_sse4 --mkfs --key AQBi+61pYB4dDRAAdDh71u57n2ceFxJrl3uh2A== --osd-uuid 7e275e37-e055-49a1-8a6e-e0d310882b91 2026-03-08T22:42:42.375 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:42.375+0000 7f73efc12780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:42.375 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:42.377+0000 7f73efc12780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:42:42.380 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:42.381+0000 7f73efc12780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:42.380 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:42.382+0000 7f73efc12780 -1 bdev(0x55d0ce22dc00 td/test-erasure-code-plugins/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:42:42.380 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:42.382+0000 7f73efc12780 -1 bluestore(td/test-erasure-code-plugins/0) _read_fsid unparsable uuid 2026-03-08T22:42:44.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code-plugins/0/keyring 2026-03-08T22:42:44.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:42:44.490 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:42:44.491 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:42:44.491 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code-plugins/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:42:44.633 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:42:44.634 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:42:44.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=728c4f63-a414-408c-84f0-672605d9b467 --auth-supported=none --mon-host=127.0.0.1:17110 --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code-plugins/0 --osd-journal=td/test-erasure-code-plugins/0/journal --chdir= --run-dir=td/test-erasure-code-plugins '--admin-socket=/tmp/ceph-asok.50927/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code-plugins/$name.log' '--pid-file=td/test-erasure-code-plugins/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_erasure_code_plugins=shec_sse4 2026-03-08T22:42:44.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:42:44.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:42:44.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:42:44.651 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:44.653+0000 7f070520d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:44.653 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:44.655+0000 7f070520d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:44.655 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:44.656+0000 7f070520d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:42:44.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:42:44.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:42:44.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:42:44.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:42:44.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:42:44.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:44.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:42:44.779 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:42:44.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:44.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:45.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:45.729 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:45.731+0000 7f070520d780 -1 Falling back to public interface 2026-03-08T22:42:46.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:42:46.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:46.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:42:46.021 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:42:46.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:46.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:46.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:46.592 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:46.594+0000 7f070520d780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:42:47.240 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:42:47.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:47.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:47.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:42:47.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:47.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:47.508 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:48.511 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:42:48.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:48.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:48.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:42:48.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:48.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:48.760 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/4291799360,v1:127.0.0.1:6803/4291799360] [v2:127.0.0.1:6804/4291799360,v1:127.0.0.1:6805/4291799360] exists,up 7e275e37-e055-49a1-8a6e-e0d310882b91 2026-03-08T22:42:48.760 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:42:48.760 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:42:48.760 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:42:48.761 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: 
get_asok_path osd.0 2026-03-08T22:42:48.761 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:48.761 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:48.761 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:48.761 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:48.761 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:48.761 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50927/ceph-osd.0.asok 2026-03-08T22:42:48.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: CEPH_ARGS= 2026-03-08T22:42:48.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:49: TEST_preload_warning: ceph --admin-daemon /tmp/ceph-asok.50927/ceph-osd.0.asok log flush 2026-03-08T22:42:48.813 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:42:48.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:50: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin shec_sse4' td/test-erasure-code-plugins/mon.a.log 2026-03-08T22:42:48.820 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:41.893+0000 7f891b010d80 0 WARNING: 
osd_erasure_code_plugins contains plugin shec_sse4 that is now deprecated. Please modify the value for osd_erasure_code_plugins to use shec instead. 2026-03-08T22:42:48.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:51: TEST_preload_warning: grep 'WARNING: osd_erasure_code_plugins contains plugin shec_sse4' td/test-erasure-code-plugins/osd.0.log 2026-03-08T22:42:48.822 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:45.732+0000 7f070520d780 0 WARNING: osd_erasure_code_plugins contains plugin shec_sse4 that is now deprecated. Please modify the value for osd_erasure_code_plugins to use shec instead. 2026-03-08T22:42:48.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:52: TEST_preload_warning: teardown td/test-erasure-code-plugins 2026-03-08T22:42:48.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:48.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:48.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:42:48.822 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:48.822 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:48.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:48.823 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:48.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:48.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:48.936 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:48.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:48.937 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:42:48.938 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:48.938 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:48.938 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:48.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:48.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:48.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q 
'^core\|core$' 2026-03-08T22:42:48.939 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:48.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:48.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:48.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:42:48.945 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:48.945 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:48.945 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:48.946 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:42:48.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:48.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:48.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code-plugins.sh:54: TEST_preload_warning: return 0 2026-03-08T22:42:48.948 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0 2026-03-08T22:42:48.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/test-erasure-code-plugins 0 2026-03-08T22:42:48.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code-plugins 2026-03-08T22:42:48.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0 2026-03-08T22:42:48.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code-plugins KILL 2026-03-08T22:42:48.948 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:48.948 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:48.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:48.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:48.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:48.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:48.951 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 
2026-03-08T22:42:48.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:48.952 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:42:48.953 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:48.953 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:48.954 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:48.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:48.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:48.955 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:48.955 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:48.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:48.957 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-08T22:42:48.957 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code-plugins 2026-03-08T22:42:48.958 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:48.958 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:48.958 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50927 2026-03-08T22:42:48.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50927 2026-03-08T22:42:48.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:48.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:48.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T22:42:48.960 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T22:42:48.960 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T22:42:48.988 INFO:tasks.workunit:Running workunit erasure-code/test-erasure-code.sh... 
2026-03-08T22:42:48.989 DEBUG:teuthology.orchestra.run.vm04:workunit test erasure-code/test-erasure-code.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh 2026-03-08T22:42:49.053 INFO:tasks.workunit.client.0.vm04.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T22:42:49.057 INFO:tasks.workunit.client.0.vm04.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T22:42:49.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:42:49.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:42:49.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T22:42:49.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T22:42:49.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T22:42:49.057 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null
2026-03-08T22:42:49.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS
2026-03-08T22:42:49.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code
2026-03-08T22:42:49.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/test-erasure-code
2026-03-08T22:42:49.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:22: run: local dir=td/test-erasure-code
2026-03-08T22:42:49.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:23: run: shift
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:25: run: export CEPH_MON=127.0.0.1:7101
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:25: run: CEPH_MON=127.0.0.1:7101
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:26: run: export CEPH_ARGS
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:27: run: uuidgen
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:27: run: CEPH_ARGS+='--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none '
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:28: run: CEPH_ARGS+='--mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false'
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:30: run: setup td/test-erasure-code
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-code
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-code
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:49.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code KILL
2026-03-08T22:42:49.059 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:49.059 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:49.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:49.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:49.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:49.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:49.061 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:49.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:49.062 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:49.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:49.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:49.063 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:49.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:49.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:49.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:49.064 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:49.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:49.066 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:49.066 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code
2026-03-08T22:42:49.067 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:49.067 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:49.067 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377
2026-03-08T22:42:49.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.74377
2026-03-08T22:42:49.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:49.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:49.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-code
2026-03-08T22:42:49.069 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:42:49.069 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:49.069 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377
2026-03-08T22:42:49.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.74377
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']'
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-code 1' TERM HUP INT
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:31: run: run_mon td/test-erasure-code a
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-code
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-code/a
2026-03-08T22:42:49.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-code/a --run-dir=td/test-erasure-code
2026-03-08T22:42:49.095 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:42:49.095 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:49.095 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:49.095 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:49.095 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:49.095 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377
2026-03-08T22:42:49.096 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok'
2026-03-08T22:42:49.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-code/a '--log-file=td/test-erasure-code/$name.log' '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-code/log --run-dir=td/test-erasure-code '--pid-file=td/test-erasure-code/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:42:49.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:42:49.124 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:42:49.124 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:49.124 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:49.124 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:42:49.125 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:49.125 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:49.125 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:49.125 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:49.125 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:49.125 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377
2026-03-08T22:42:49.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:42:49.125 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.74377/ceph-mon.a.asok
2026-03-08T22:42:49.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:49.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.74377/ceph-mon.a.asok config get fsid
2026-03-08T22:42:49.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:42:49.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:49.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:49.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:42:49.182 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:42:49.182 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:49.182 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:49.182 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:49.182 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:49.182 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:49.182 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377
2026-03-08T22:42:49.182 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.74377/ceph-mon.a.asok
2026-03-08T22:42:49.182 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:49.182 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.74377/ceph-mon.a.asok config get mon_host
2026-03-08T22:42:49.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:32: run: run_mgr td/test-erasure-code x
2026-03-08T22:42:49.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-code
2026-03-08T22:42:49.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:42:49.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:42:49.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:42:49.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-code/x
2026-03-08T22:42:49.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:42:49.367 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:42:49.367 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:49.367 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:49.367 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:49.367 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:49.367 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377
2026-03-08T22:42:49.367 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok'
2026-03-08T22:42:49.367 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:49.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-code/x '--log-file=td/test-erasure-code/$name.log' '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --run-dir=td/test-erasure-code '--pid-file=td/test-erasure-code/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:49.390 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:34: run: get_asok_path mon.a
2026-03-08T22:42:49.390 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:49.390 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:49.390 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:49.390 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:49.390 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377
2026-03-08T22:42:49.390 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.74377/ceph-mon.a.asok
2026-03-08T22:42:49.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:34: run: CEPH_ARGS=
2026-03-08T22:42:49.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:34: run: ceph --admin-daemon /tmp/ceph-asok.74377/ceph-mon.a.asok log flush
2026-03-08T22:42:49.435 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:42:49.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:35: run: grep 'load: jerasure.*lrc' td/test-erasure-code/mon.a.log
2026-03-08T22:42:49.444 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:49.117+0000 7f49215cdd80 0 load: jerasure load: lrc
2026-03-08T22:42:49.444 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: seq 0 10
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: for id in $(seq 0 10)
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:37: run: run_osd td/test-erasure-code 0
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code/0
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false'
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code/0'
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code/0/journal'
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:42:49.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:42:49.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code/0
2026-03-08T22:42:49.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:42:49.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=315dfaac-510a-4ed1-9710-3a3e93f2ffd1
2026-03-08T22:42:49.448 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 315dfaac-510a-4ed1-9710-3a3e93f2ffd1
2026-03-08T22:42:49.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 315dfaac-510a-4ed1-9710-3a3e93f2ffd1'
2026-03-08T22:42:49.448 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:42:49.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBp+61pklytGxAAXn18BI52Hb1bkZZ/exsDng==
2026-03-08T22:42:49.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBp+61pklytGxAAXn18BI52Hb1bkZZ/exsDng=="}'
2026-03-08T22:42:49.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 315dfaac-510a-4ed1-9710-3a3e93f2ffd1 -i td/test-erasure-code/0/new.json
2026-03-08T22:42:49.586 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:42:49.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code/0/new.json
2026-03-08T22:42:49.595 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/0 --osd-journal=td/test-erasure-code/0/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBp+61pklytGxAAXn18BI52Hb1bkZZ/exsDng== --osd-uuid 315dfaac-510a-4ed1-9710-3a3e93f2ffd1
2026-03-08T22:42:49.615 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:49.616+0000 7f2ddde0d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:49.616 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:49.618+0000 7f2ddde0d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:49.619 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:49.621+0000 7f2ddde0d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:49.619 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:49.621+0000 7f2ddde0d780 -1 bdev(0x55d2b1db5c00 td/test-erasure-code/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:42:49.619 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:49.621+0000 7f2ddde0d780 -1 bluestore(td/test-erasure-code/0) _read_fsid unparsable uuid
2026-03-08T22:42:51.728 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code/0/keyring
2026-03-08T22:42:51.728 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:42:51.729 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository
2026-03-08T22:42:51.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:42:51.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:42:51.879 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0
2026-03-08T22:42:51.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:42:51.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:42:51.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:42:51.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:42:51.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/0 --osd-journal=td/test-erasure-code/0/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:42:51.911 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:51.912+0000 7fe974790780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:51.926 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:51.928+0000 7fe974790780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:51.929 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:51.931+0000 7fe974790780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:52.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:42:52.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:42:52.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:42:52.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:42:52.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:42:52.035 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:42:52.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:52.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:42:52.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:52.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:52.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:52.485 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:52.487+0000 7fe974790780 -1 Falling back to public interface
2026-03-08T22:42:53.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:53.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:53.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:42:53.276 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:42:53.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:53.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:53.356 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:53.358+0000 7fe974790780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:42:53.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:54.526 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:42:54.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:54.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:54.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:42:54.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:54.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:54.811 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:55.813 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:42:55.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:55.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:55.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:42:55.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:55.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:56.042 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3976641981,v1:127.0.0.1:6803/3976641981] [v2:127.0.0.1:6804/3976641981,v1:127.0.0.1:6805/3976641981] exists,up 315dfaac-510a-4ed1-9710-3a3e93f2ffd1
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: for id in $(seq 0 10)
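The trace above shows ceph-helpers.sh's `wait_for_osd` loop: poll `ceph osd dump` once per second, up to 300 attempts, until `grep 'osd.0 up'` succeeds. A minimal standalone sketch of the same retry pattern, with the ceph call generalized to an arbitrary predicate command (the name `wait_for` and its interface are illustrative, not part of ceph-helpers.sh):

```shell
#!/usr/bin/env bash
# Poll a predicate command once per second until it succeeds or the
# attempt budget is spent; mirrors the wait_for_osd loop traced above,
# where the predicate is `ceph osd dump | grep "osd.$id up"` and the
# budget is 300 tries.
wait_for() {
    local max_tries=$1; shift
    local i
    for ((i = 0; i < max_tries; i++)); do
        if "$@"; then
            return 0          # predicate passed (the OSD is "up")
        fi
        sleep 1               # same 1-second back-off as the helper
    done
    return 1                  # timed out; caller reports the failure
}
```

In the helper, success breaks the loop and sets `status=0`, exactly as the `break` / `return 0` lines above show.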
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:37: run: run_osd td/test-erasure-code 1
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code/1
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false'
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code/1'
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code/1/journal'
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:42:56.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code'
2026-03-08T22:42:56.044 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:42:56.044 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:56.044 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:56.044 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:56.044 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:56.044 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377
2026-03-08T22:42:56.044 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok'
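The `get_asok_path` / `get_asok_dir` trace above shows a common override-or-default pattern: the `'[' -n '' ']'` lines are the xtrace of an emptiness check on an override value, and the fallback is a PID-scoped directory under /tmp (`/tmp/ceph-asok.74377` here, 74377 being the test shell's PID). A sketch of that pattern; the override variable name `CEPH_ASOK_DIR` and the exact fallback wording are assumptions based on the trace, not a verbatim copy of the helper:

```shell
#!/usr/bin/env bash
# Override-or-default resolution of the admin-socket path, as traced
# above. CEPH_ASOK_DIR as the override variable is an assumption.
get_asok_dir() {
    if [ -n "$CEPH_ASOK_DIR" ]; then
        echo "$CEPH_ASOK_DIR"
    else
        echo "/tmp/ceph-asok.$$"   # e.g. /tmp/ceph-asok.74377 in the log
    fi
}

get_asok_path() {
    local name=$1
    if [ -n "$name" ]; then
        echo "$(get_asok_dir)/ceph-$name.asok"
    else
        # Left unexpanded on purpose: the daemon substitutes $cluster
        # and $name itself via its --admin-socket option.
        echo "$(get_asok_dir)/\$cluster-\$name.asok"
    fi
}
```

Note the literal, single-quoted `$cluster-$name.asok` in the trace: the metavariables are expanded by ceph-osd, not by the shell.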
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok'
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log'
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid'
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:42:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code/1
2026-03-08T22:42:56.046 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:42:56.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=f6e527b1-644f-4b7a-90f8-fc7b6916006b
2026-03-08T22:42:56.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 f6e527b1-644f-4b7a-90f8-fc7b6916006b'
2026-03-08T22:42:56.047 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 f6e527b1-644f-4b7a-90f8-fc7b6916006b
2026-03-08T22:42:56.047 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:42:56.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBw+61pzzv4AxAA8QwbkzFkBUFFct0hCETmmg==
2026-03-08T22:42:56.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBw+61pzzv4AxAA8QwbkzFkBUFFct0hCETmmg=="}'
2026-03-08T22:42:56.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new f6e527b1-644f-4b7a-90f8-fc7b6916006b -i td/test-erasure-code/1/new.json
2026-03-08T22:42:56.416 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:42:56.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code/1/new.json
2026-03-08T22:42:56.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/1 --osd-journal=td/test-erasure-code/1/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBw+61pzzv4AxAA8QwbkzFkBUFFct0hCETmmg== --osd-uuid f6e527b1-644f-4b7a-90f8-fc7b6916006b
2026-03-08T22:42:56.456 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:56.457+0000 7fae88410780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:56.460 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:56.460+0000 7fae88410780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:56.460 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:56.461+0000 7fae88410780 -1 WARNING: all dangerous and experimental features are enabled.
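As traced above, before formatting the data directory the helper generates a cephx key with `ceph-authtool --gen-print-key`, wraps it in a one-line JSON document, and feeds that to `ceph osd new <uuid> -i new.json`, which registers the OSD and prints its allocated id (the lone `stdout:1`). A sketch of assembling that payload; the function name `make_new_json` is illustrative, and the secret used in the test is a placeholder, not a real key:

```shell
#!/usr/bin/env bash
# Build the new.json payload consumed by `ceph osd new $uuid -i new.json`,
# matching the echo on ceph-helpers.sh:665. In the real helper the secret
# comes from `ceph-authtool --gen-print-key`.
make_new_json() {
    local secret=$1
    printf '{"cephx_secret": "%s"}\n' "$secret"
}
```

The helper deletes `new.json` immediately after `ceph osd new` returns, as the `rm td/test-erasure-code/1/new.json` line shows.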
2026-03-08T22:42:56.461 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:56.462+0000 7fae88410780 -1 bdev(0x563b6a795c00 td/test-erasure-code/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:42:56.461 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:56.462+0000 7fae88410780 -1 bluestore(td/test-erasure-code/1) _read_fsid unparsable uuid
2026-03-08T22:42:58.622 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code/1/keyring
2026-03-08T22:42:58.623 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:42:58.623 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository
2026-03-08T22:42:58.623 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T22:42:58.623 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:42:58.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T22:42:58.933 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1
2026-03-08T22:42:58.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/1 --osd-journal=td/test-erasure-code/1/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:42:58.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:42:58.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:42:58.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:42:58.953 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:58.954+0000 7fecc13d4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:58.960 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:58.962+0000 7fecc13d4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:58.962 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:58.963+0000 7fecc13d4780 -1 WARNING: all dangerous and experimental features are enabled.
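Note that `run_osd` invokes ceph-osd twice with an identical argument list: first with `--mkfs --key … --osd-uuid …` to format the data directory (the `bdev … Operation not permitted` / `_read_fsid unparsable uuid` lines during that phase are BlueStore probing a not-yet-formatted directory), then again without those flags to start the daemon. A dry-run sketch of that two-phase shape, echoing the commands instead of executing them so no cluster is needed (the function name and the abbreviated argument list are illustrative):

```shell
#!/usr/bin/env bash
# Dry-run sketch of run_osd's two-phase bring-up: the same argument
# list is reused, with --mkfs/--key/--osd-uuid only on the first call.
# Echoes the commands rather than exec'ing ceph-osd.
osd_bringup_cmds() {
    local id=$1 uuid=$2 secret=$3
    local args="-i $id --osd-data=td/test-erasure-code/$id"
    echo "ceph-osd $args --mkfs --key $secret --osd-uuid $uuid"  # phase 1: format
    echo "ceph-osd $args"                                        # phase 2: start daemon
}
```

Keeping one `$ceph_args` string for both phases, as the helper does, guarantees the daemon starts against exactly the layout it formatted.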
2026-03-08T22:42:59.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T22:42:59.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:42:59.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:42:59.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:42:59.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:42:59.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:59.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:42:59.171 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:42:59.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:59.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:42:59.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:59.790 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:42:59.792+0000 7fecc13d4780 -1 Falling back to public interface
2026-03-08T22:43:00.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:43:00.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:43:00.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:43:00.403 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:43:00.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:43:00.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:43:00.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:43:00.895 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:00.897+0000 7fecc13d4780 -1 osd.1 0 log_to_monitors true
2026-03-08T22:43:01.645 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:43:01.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:43:01.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:43:01.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:43:01.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:43:01.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:43:01.884 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:43:02.886 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:43:02.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:43:02.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:43:02.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:43:02.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:43:02.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:43:03.114 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3766217364,v1:127.0.0.1:6811/3766217364] [v2:127.0.0.1:6812/3766217364,v1:127.0.0.1:6813/3766217364] exists,up f6e527b1-644f-4b7a-90f8-fc7b6916006b
2026-03-08T22:43:03.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:43:03.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:43:03.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:43:03.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: for id in $(seq 0 10)
2026-03-08T22:43:03.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:37: run: run_osd td/test-erasure-code 2
2026-03-08T22:43:03.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code/2
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false'
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code/2'
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code/2/journal'
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code'
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377
2026-03-08T22:43:03.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok'
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok'
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log'
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid'
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:43:03.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code/2
2026-03-08T22:43:03.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:43:03.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=45238ce9-ea0e-42d3-a52e-255e37addf26
2026-03-08T22:43:03.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 45238ce9-ea0e-42d3-a52e-255e37addf26'
2026-03-08T22:43:03.118 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 45238ce9-ea0e-42d3-a52e-255e37addf26
2026-03-08T22:43:03.119 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:43:03.135 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB3+61pMDorCBAAVhbieVi39+9HxVPpekPeeg==
2026-03-08T22:43:03.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB3+61pMDorCBAAVhbieVi39+9HxVPpekPeeg=="}'
2026-03-08T22:43:03.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 45238ce9-ea0e-42d3-a52e-255e37addf26 -i td/test-erasure-code/2/new.json
2026-03-08T22:43:03.366 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:43:03.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code/2/new.json
2026-03-08T22:43:03.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/2 --osd-journal=td/test-erasure-code/2/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB3+61pMDorCBAAVhbieVi39+9HxVPpekPeeg== --osd-uuid 45238ce9-ea0e-42d3-a52e-255e37addf26
2026-03-08T22:43:03.398 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:03.400+0000 7f25aa80c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:43:03.400 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:03.402+0000 7f25aa80c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:43:03.402 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:03.403+0000 7f25aa80c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:43:03.403 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:03.404+0000 7f25aa80c780 -1 bdev(0x558bb5f73c00 td/test-erasure-code/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:43:03.403 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:03.404+0000 7f25aa80c780 -1 bluestore(td/test-erasure-code/2) _read_fsid unparsable uuid 2026-03-08T22:43:05.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code/2/keyring 2026-03-08T22:43:05.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:43:05.797 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:43:05.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:43:05.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:43:06.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:43:06.101 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:43:06.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/2 --osd-journal=td/test-erasure-code/2/journal --chdir= --run-dir=td/test-erasure-code 
'--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:43:06.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:43:06.102 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:43:06.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:43:06.120 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:06.121+0000 7f204121e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:06.126 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:06.128+0000 7f204121e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:06.128 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:06.129+0000 7f204121e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:06.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:43:06.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:43:06.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:43:06.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:43:06.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:43:06.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:06.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:43:06.340 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:43:06.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:06.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:43:06.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:06.938 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:06.940+0000 7f204121e780 -1 Falling back to public interface 2026-03-08T22:43:07.567 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:43:07.567 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:07.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:07.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:43:07.568 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:07.568 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:43:07.800 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:07.816 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:07.818+0000 7f204121e780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:43:08.802 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:43:08.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:08.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:08.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:43:08.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:08.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:43:09.057 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:09.643 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:09.644+0000 7f203c414640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T22:43:10.059 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:43:10.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:10.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:10.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:43:10.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:10.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:43:10.282 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1760177345,v1:127.0.0.1:6819/1760177345] [v2:127.0.0.1:6820/1760177345,v1:127.0.0.1:6821/1760177345] exists,up 45238ce9-ea0e-42d3-a52e-255e37addf26 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:43:10.283 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: for id in $(seq 0 10) 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:37: run: run_osd td/test-erasure-code 3 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code/3 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000' 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code/3' 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code/3/journal' 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:43:10.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:43:10.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code' 2026-03-08T22:43:10.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:43:10.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:10.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:43:10.284 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:43:10.284 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:10.284 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:43:10.285 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:10.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:10.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:43:10.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:43:10.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:43:10.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:43:10.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:43:10.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:43:10.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:43:10.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:43:10.285 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:43:10.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:43:10.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:43:10.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code/3 2026-03-08T22:43:10.287 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:43:10.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=7f037384-bb57-41b7-b6c3-50e879a92b0a 2026-03-08T22:43:10.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 7f037384-bb57-41b7-b6c3-50e879a92b0a' 2026-03-08T22:43:10.288 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 7f037384-bb57-41b7-b6c3-50e879a92b0a 2026-03-08T22:43:10.289 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:43:10.305 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB++61p/TVMEhAA+u9EdQYLaWynWkOL0Mtk3Q== 2026-03-08T22:43:10.305 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB++61p/TVMEhAA+u9EdQYLaWynWkOL0Mtk3Q=="}' 2026-03-08T22:43:10.305 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 7f037384-bb57-41b7-b6c3-50e879a92b0a -i td/test-erasure-code/3/new.json 2026-03-08T22:43:10.564 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:43:10.576 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code/3/new.json 2026-03-08T22:43:10.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/3 --osd-journal=td/test-erasure-code/3/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB++61p/TVMEhAA+u9EdQYLaWynWkOL0Mtk3Q== --osd-uuid 7f037384-bb57-41b7-b6c3-50e879a92b0a 2026-03-08T22:43:10.597 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:10.599+0000 7eff33811780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:10.599 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:10.601+0000 7eff33811780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:10.600 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:10.602+0000 7eff33811780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:10.601 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:10.602+0000 7eff33811780 -1 bdev(0x55a8b31bdc00 td/test-erasure-code/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:43:10.601 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:10.602+0000 7eff33811780 -1 bluestore(td/test-erasure-code/3) _read_fsid unparsable uuid 2026-03-08T22:43:12.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code/3/keyring 2026-03-08T22:43:12.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:43:12.735 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository 2026-03-08T22:43:12.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:43:12.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:43:13.028 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3 2026-03-08T22:43:13.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:43:13.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/3 --osd-journal=td/test-erasure-code/3/journal --chdir= --run-dir=td/test-erasure-code 
'--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:43:13.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:43:13.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:43:13.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:43:13.047 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:13.048+0000 7ff24ad64780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:13.059 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:13.061+0000 7ff24ad64780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:13.060 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:13.062+0000 7ff24ad64780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:13.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:43:13.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:43:13.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:43:13.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:43:13.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:43:13.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:13.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:43:13.270 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:43:13.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:13.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:43:13.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:14.398 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:14.399+0000 7ff24ad64780 -1 Falling back to public interface 2026-03-08T22:43:14.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:43:14.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:14.505 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:43:14.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:43:14.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:14.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:43:14.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:15.727 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:43:15.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:15.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:15.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:43:15.728 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:15.728 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:43:15.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:16.027 
INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:16.029+0000 7ff24ad64780 -1 osd.3 0 log_to_monitors true 2026-03-08T22:43:16.980 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:43:16.980 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:16.980 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:16.980 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:43:16.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:43:16.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:17.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:18.241 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T22:43:18.241 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:18.241 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:18.241 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:43:18.241 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:18.241 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:43:18.474 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 20 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3638101593,v1:127.0.0.1:6827/3638101593] [v2:127.0.0.1:6828/3638101593,v1:127.0.0.1:6829/3638101593] exists,up 7f037384-bb57-41b7-b6c3-50e879a92b0a 2026-03-08T22:43:18.474 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: for id in $(seq 0 10) 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:37: run: run_osd td/test-erasure-code 4 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:43:18.475 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code/4 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code/4' 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code/4/journal' 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code' 2026-03-08T22:43:18.475 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:43:18.475 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:43:18.476 
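The `get_asok_dir`/`get_asok_path` calls traced above (ceph-helpers.sh:108-120) can be sketched as below. This is a reconstruction from the xtrace, not the verbatim helper: the `CEPH_ASOK_DIR` override variable is an assumption (the trace only shows the `[ -n '' ]` test failing and the `/tmp/ceph-asok.$$` fallback being taken), and the named-daemon branch is never exercised in this run.

```shell
get_asok_dir() {
    # CEPH_ASOK_DIR is the assumed override; the trace falls through to /tmp
    if [ -n "$CEPH_ASOK_DIR" ]; then
        echo "$CEPH_ASOK_DIR"
    else
        echo "/tmp/ceph-asok.$$"
    fi
}

get_asok_path() {
    local name=$1
    if [ -n "$name" ]; then
        echo "$(get_asok_dir)/ceph-$name.asok"
    else
        # no name given: emit a template with literal $cluster/$name, which
        # is why the log shows '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok'
        echo "$(get_asok_dir)/\$cluster-\$name.asok"
    fi
}
```

The single quotes around the `--admin-socket=` value in the traced command lines are xtrace's way of showing that `$cluster` and `$name` reach ceph-osd unexpanded, to be filled in by Ceph's own metavariable substitution.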
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:43:18.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code/4 2026-03-08T22:43:18.477 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:43:18.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local 
uuid=e8b60ba7-0adb-4d7a-8362-9b994ba417d7 2026-03-08T22:43:18.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 e8b60ba7-0adb-4d7a-8362-9b994ba417d7' 2026-03-08T22:43:18.478 INFO:tasks.workunit.client.0.vm04.stdout:add osd4 e8b60ba7-0adb-4d7a-8362-9b994ba417d7 2026-03-08T22:43:18.479 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:43:18.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCG+61pTjOHHRAACF3Ud+59SWyDKph6oYh48A== 2026-03-08T22:43:18.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCG+61pTjOHHRAACF3Ud+59SWyDKph6oYh48A=="}' 2026-03-08T22:43:18.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e8b60ba7-0adb-4d7a-8362-9b994ba417d7 -i td/test-erasure-code/4/new.json 2026-03-08T22:43:18.721 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T22:43:18.732 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code/4/new.json 2026-03-08T22:43:18.733 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/4 --osd-journal=td/test-erasure-code/4/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 
--debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCG+61pTjOHHRAACF3Ud+59SWyDKph6oYh48A== --osd-uuid e8b60ba7-0adb-4d7a-8362-9b994ba417d7 2026-03-08T22:43:18.752 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:18.753+0000 7fad8c615780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:18.754 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:18.756+0000 7fad8c615780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:18.755 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:18.757+0000 7fad8c615780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:18.756 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:18.757+0000 7fad8c615780 -1 bdev(0x55a41e575c00 td/test-erasure-code/4/block) open stat got: (1) Operation not permitted 2026-03-08T22:43:18.756 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:18.757+0000 7fad8c615780 -1 bluestore(td/test-erasure-code/4) _read_fsid unparsable uuid 2026-03-08T22:43:21.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code/4/keyring 2026-03-08T22:43:21.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:43:21.678 INFO:tasks.workunit.client.0.vm04.stdout:adding osd4 key to auth repository 2026-03-08T22:43:21.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository 2026-03-08T22:43:21.678 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:43:21.974 INFO:tasks.workunit.client.0.vm04.stdout:start osd.4 2026-03-08T22:43:21.974 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4 2026-03-08T22:43:21.974 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/4 --osd-journal=td/test-erasure-code/4/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:43:21.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:43:21.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:43:21.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:43:21.993 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:21.994+0000 7f5c28953780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:22.002 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:22.004+0000 7f5c28953780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:22.004 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:22.005+0000 7f5c28953780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:22.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4 2026-03-08T22:43:22.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:43:22.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4 2026-03-08T22:43:22.196 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:43:22.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:43:22.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:43:22.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:22.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:43:22.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:22.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:43:22.427 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:22.565 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:22.566+0000 7f5c28953780 -1 Falling back to public interface 2026-03-08T22:43:23.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:23.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:23.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:43:23.429 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:43:23.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:23.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:43:23.650 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:23.932 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:23.933+0000 7f5c28953780 -1 osd.4 0 log_to_monitors true 2026-03-08T22:43:24.652 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:43:24.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:24.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:24.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:43:24.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:43:24.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:24.920 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:25.922 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:43:25.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:25.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:25.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:43:25.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:25.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:43:26.287 INFO:tasks.workunit.client.0.vm04.stdout:osd.4 up in weight 1 up_from 25 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/2434475578,v1:127.0.0.1:6835/2434475578] [v2:127.0.0.1:6836/2434475578,v1:127.0.0.1:6837/2434475578] exists,up e8b60ba7-0adb-4d7a-8362-9b994ba417d7 2026-03-08T22:43:26.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:43:26.288 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:43:26.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:43:26.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: for id in $(seq 0 10) 2026-03-08T22:43:26.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:37: run: run_osd td/test-erasure-code 5 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=5 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code/5 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:43:26.292 
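The `wait_for_osd` loop that just completed for osd.4 (ceph-helpers.sh:978-991) polls `ceph osd dump | grep "osd.$id $state"` once per second, up to 300 attempts, echoing the attempt counter each round. A generic, self-contained sketch of that polling pattern (`wait_for_cmd` is a hypothetical name; the real helper hardcodes the `ceph osd dump`/`grep` pipeline):

```shell
wait_for_cmd() {
    # Poll "$@" once per second, up to 300 attempts, mirroring the
    # status=1 / echo $i / break / return structure seen in the trace.
    status=1
    i=0
    while [ "$i" -lt 300 ]; do
        echo "$i"          # the bare 0,1,2,3 lines in stdout above
        if "$@"; then
            status=0
            break
        fi
        sleep 1
        i=$((i + 1))
    done
    return $status
}
```

In the log this pattern is entered as `wait_for_osd up 4` and succeeds on the fourth probe, once `ceph osd dump` reports `osd.4 up in weight 1 ...`.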
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code/5' 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code/5/journal' 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:43:26.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code' 2026-03-08T22:43:26.293 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:43:26.293 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:26.293 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:43:26.293 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:43:26.293 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:26.293 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:43:26.293 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:26.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: 
run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code/5 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=90e018b2-83fd-4a55-9054-b797e1abdd19 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd5 90e018b2-83fd-4a55-9054-b797e1abdd19' 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stdout:add osd5 90e018b2-83fd-4a55-9054-b797e1abdd19 2026-03-08T22:43:26.294 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:43:26.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: 
OSD_SECRET=AQCO+61pLY2FEhAAw9ScQPCialELWYFZmTZsEw== 2026-03-08T22:43:26.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCO+61pLY2FEhAAw9ScQPCialELWYFZmTZsEw=="}' 2026-03-08T22:43:26.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 90e018b2-83fd-4a55-9054-b797e1abdd19 -i td/test-erasure-code/5/new.json 2026-03-08T22:43:26.550 INFO:tasks.workunit.client.0.vm04.stdout:5 2026-03-08T22:43:26.561 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code/5/new.json 2026-03-08T22:43:26.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 5 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/5 --osd-journal=td/test-erasure-code/5/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCO+61pLY2FEhAAw9ScQPCialELWYFZmTZsEw== --osd-uuid 90e018b2-83fd-4a55-9054-b797e1abdd19 2026-03-08T22:43:26.585 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:26.586+0000 7fb7bbe0d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:26.587 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:26.588+0000 7fb7bbe0d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:26.588 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:26.589+0000 7fb7bbe0d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:26.588 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:26.590+0000 7fb7bbe0d780 -1 bdev(0x5648fe041c00 td/test-erasure-code/5/block) open stat got: (1) Operation not permitted 2026-03-08T22:43:26.588 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:26.590+0000 7fb7bbe0d780 -1 bluestore(td/test-erasure-code/5) _read_fsid unparsable uuid 2026-03-08T22:43:28.710 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code/5/keyring 2026-03-08T22:43:28.710 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:43:28.710 INFO:tasks.workunit.client.0.vm04.stdout:adding osd5 key to auth repository 2026-03-08T22:43:28.711 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd5 key to auth repository 2026-03-08T22:43:28.711 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code/5/keyring auth add osd.5 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:43:29.010 INFO:tasks.workunit.client.0.vm04.stdout:start osd.5 2026-03-08T22:43:29.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.5 2026-03-08T22:43:29.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: 
ceph-osd -i 5 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/5 --osd-journal=td/test-erasure-code/5/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:43:29.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:43:29.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:43:29.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:43:29.029 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:29.030+0000 7f9679a15780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:29.036 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:29.037+0000 7f9679a15780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:29.037 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:29.039+0000 7f9679a15780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:29.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 5 2026-03-08T22:43:29.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:43:29.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T22:43:29.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:43:29.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:43:29.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:29.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:43:29.246 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:43:29.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:29.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:43:29.480 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:30.105 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:30.106+0000 7f9679a15780 -1 Falling back to public interface 2026-03-08T22:43:30.481 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:43:30.482 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:30.482 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:43:30.482 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:43:30.483 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:30.483 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:43:30.702 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:30.951 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:30.953+0000 7f9679a15780 -1 osd.5 0 log_to_monitors true 2026-03-08T22:43:31.704 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:43:31.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:31.705 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:31.705 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:43:31.705 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:31.705 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:43:31.951 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:32.954 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:43:32.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:32.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:32.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:43:32.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:32.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:43:33.184 INFO:tasks.workunit.client.0.vm04.stdout:osd.5 up in weight 1 up_from 30 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6842/636248667,v1:127.0.0.1:6843/636248667] [v2:127.0.0.1:6844/636248667,v1:127.0.0.1:6845/636248667] exists,up 90e018b2-83fd-4a55-9054-b797e1abdd19 2026-03-08T22:43:33.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:43:33.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: for id in $(seq 0 10) 
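The `wait_for_osd` trace above boils down to a poll-until-ready loop: run `ceph osd dump`, grep for `osd.<id> up`, and sleep one second between attempts, for at most 300 tries. A minimal generic sketch of that retry pattern (the `retry_until` name is illustrative, not a helper from ceph-helpers.sh):

```shell
# Poll a command once per second until it succeeds, for at most
# $1 attempts; returns 0 on success, 1 if every attempt failed.
# This mirrors the wait_for_osd loop traced above, which runs
# `ceph osd dump | grep "osd.$id up"` for up to 300 iterations.
retry_until() {
    local attempts=$1
    shift
    local i
    for ((i = 0; i < attempts; i++)); do
        if "$@"; then
            return 0
        fi
        sleep 1
    done
    return 1
}
```

In the trace, osd.5 shows up as `up` on the fourth attempt (counter 3), well inside the 300-iteration budget.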
2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:37: run: run_osd td/test-erasure-code 6 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=6 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code/6 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: 
run_osd: ceph_args+=' --osd-data=td/test-erasure-code/6' 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code/6/journal' 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code' 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:33.185 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:43:33.186 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 
2026-03-08T22:43:33.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:33.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:43:33.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:43:33.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:43:33.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:43:33.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:43:33.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:43:33.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:43:33.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:43:33.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:43:33.187 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:43:33.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:43:33.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code/6 2026-03-08T22:43:33.188 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:43:33.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1473e18f-5ee6-44b3-9c16-2f2200607177 2026-03-08T22:43:33.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd6 1473e18f-5ee6-44b3-9c16-2f2200607177' 2026-03-08T22:43:33.189 INFO:tasks.workunit.client.0.vm04.stdout:add osd6 1473e18f-5ee6-44b3-9c16-2f2200607177 2026-03-08T22:43:33.189 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:43:33.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCV+61plqEnDBAAFI8fKFj+pZ3VBP4YL4XfWA== 2026-03-08T22:43:33.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCV+61plqEnDBAAFI8fKFj+pZ3VBP4YL4XfWA=="}' 2026-03-08T22:43:33.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1473e18f-5ee6-44b3-9c16-2f2200607177 -i td/test-erasure-code/6/new.json 2026-03-08T22:43:33.435 
INFO:tasks.workunit.client.0.vm04.stdout:6 2026-03-08T22:43:33.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code/6/new.json 2026-03-08T22:43:33.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 6 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/6 --osd-journal=td/test-erasure-code/6/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCV+61plqEnDBAAFI8fKFj+pZ3VBP4YL4XfWA== --osd-uuid 1473e18f-5ee6-44b3-9c16-2f2200607177 2026-03-08T22:43:33.464 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:33.465+0000 7f01499ae780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:33.465 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:33.467+0000 7f01499ae780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:33.466 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:33.468+0000 7f01499ae780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:33.467 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:33.468+0000 7f01499ae780 -1 bdev(0x55d74363bc00 td/test-erasure-code/6/block) open stat got: (1) Operation not permitted 2026-03-08T22:43:33.467 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:33.468+0000 7f01499ae780 -1 bluestore(td/test-erasure-code/6) _read_fsid unparsable uuid 2026-03-08T22:43:35.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code/6/keyring 2026-03-08T22:43:35.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:43:35.592 INFO:tasks.workunit.client.0.vm04.stdout:adding osd6 key to auth repository 2026-03-08T22:43:35.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd6 key to auth repository 2026-03-08T22:43:35.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code/6/keyring auth add osd.6 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:43:35.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.6 2026-03-08T22:43:35.882 INFO:tasks.workunit.client.0.vm04.stdout:start osd.6 2026-03-08T22:43:35.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 6 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/6 --osd-journal=td/test-erasure-code/6/journal --chdir= --run-dir=td/test-erasure-code 
'--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:43:35.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:43:35.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:43:35.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:43:35.900 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:35.901+0000 7fb4b0210780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:35.903 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:35.904+0000 7fb4b0210780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:35.905 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:35.906+0000 7fb4b0210780 -1 WARNING: all dangerous and experimental features are enabled. 
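The `run_osd` trace repeats the same argument assembly for each OSD id: per-OSD data and journal paths under the test directory, plus debug, log, and pid settings where the literal `$name` is left unexpanded (single-quoted) for the daemon to substitute. A minimal sketch of that string build-up (the `build_osd_args` name is illustrative; the real helper also adds the fsid, mon-host, admin-socket, and mclock options visible in the trace):

```shell
# Assemble a per-OSD argument string the way run_osd does in the
# trace: data/journal paths keyed on the id, and $name kept literal
# so ceph-osd itself expands it (to e.g. osd.6) at runtime.
build_osd_args() {
    local dir=$1
    local id=$2
    local args="--osd-data=$dir/$id"
    args+=" --osd-journal=$dir/$id/journal"
    args+=" --run-dir=$dir"
    args+=" --debug-osd=20 --debug-ms=1 --debug-monc=20"
    args+=" --log-file=$dir/\$name.log"   # \$name expanded by ceph-osd
    args+=" --pid-file=$dir/\$name.pid"
    echo "$args"
}
```

The same string is then passed twice, as the trace shows: once with `--mkfs --key ... --osd-uuid ...` to initialize the data directory, and once without `--mkfs` to start the daemon.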
2026-03-08T22:43:36.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 6 2026-03-08T22:43:36.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:43:36.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=6 2026-03-08T22:43:36.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:43:36.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:43:36.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:36.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:43:36.130 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:43:36.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:43:36.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:36.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:37.217 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:37.218+0000 7fb4b0210780 -1 Falling back to public interface 2026-03-08T22:43:37.347 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:43:37.348 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:37.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:37.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:43:37.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:37.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:43:37.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:38.306 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:38.307+0000 7fb4b0210780 -1 osd.6 0 log_to_monitors true 2026-03-08T22:43:38.659 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:43:38.659 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:38.659 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:38.659 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:43:38.659 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:38.659 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:43:38.888 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:39.891 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:43:39.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:39.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:39.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:43:39.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:39.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:43:40.128 INFO:tasks.workunit.client.0.vm04.stdout:osd.6 up in weight 1 up_from 35 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6850/3691953412,v1:127.0.0.1:6851/3691953412] [v2:127.0.0.1:6852/3691953412,v1:127.0.0.1:6853/3691953412] exists,up 1473e18f-5ee6-44b3-9c16-2f2200607177 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: for id in $(seq 0 10) 
2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:37: run: run_osd td/test-erasure-code 7 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=7 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code/7 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:43:40.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: 
run_osd: ceph_args+=' --osd-data=td/test-erasure-code/7' 2026-03-08T22:43:40.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code/7/journal' 2026-03-08T22:43:40.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:43:40.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:43:40.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code' 2026-03-08T22:43:40.130 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:43:40.130 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:40.130 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:43:40.130 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:43:40.130 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:40.130 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:43:40.131 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 
2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:43:40.132 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:43:40.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code/7 2026-03-08T22:43:40.133 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:43:40.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=b9a3485e-5913-489e-9b1a-989f1372609f 2026-03-08T22:43:40.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd7 b9a3485e-5913-489e-9b1a-989f1372609f' 2026-03-08T22:43:40.134 INFO:tasks.workunit.client.0.vm04.stdout:add osd7 b9a3485e-5913-489e-9b1a-989f1372609f 2026-03-08T22:43:40.135 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:43:40.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCc+61pu8T7CBAAyxRKu4A5t7ZQLiog3uEwDQ== 2026-03-08T22:43:40.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCc+61pu8T7CBAAyxRKu4A5t7ZQLiog3uEwDQ=="}' 2026-03-08T22:43:40.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new b9a3485e-5913-489e-9b1a-989f1372609f -i td/test-erasure-code/7/new.json 2026-03-08T22:43:40.373 
INFO:tasks.workunit.client.0.vm04.stdout:7 2026-03-08T22:43:40.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code/7/new.json 2026-03-08T22:43:40.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 7 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/7 --osd-journal=td/test-erasure-code/7/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCc+61pu8T7CBAAyxRKu4A5t7ZQLiog3uEwDQ== --osd-uuid b9a3485e-5913-489e-9b1a-989f1372609f 2026-03-08T22:43:40.405 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:40.407+0000 7f1472e0d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:40.409 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:40.410+0000 7f1472e0d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:40.410 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:40.412+0000 7f1472e0d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:40.411 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:40.412+0000 7f1472e0d780 -1 bdev(0x5576c3e11c00 td/test-erasure-code/7/block) open stat got: (1) Operation not permitted 2026-03-08T22:43:40.411 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:40.412+0000 7f1472e0d780 -1 bluestore(td/test-erasure-code/7) _read_fsid unparsable uuid 2026-03-08T22:43:42.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code/7/keyring 2026-03-08T22:43:42.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:43:42.527 INFO:tasks.workunit.client.0.vm04.stdout:adding osd7 key to auth repository 2026-03-08T22:43:42.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd7 key to auth repository 2026-03-08T22:43:42.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code/7/keyring auth add osd.7 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:43:42.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.7 2026-03-08T22:43:42.823 INFO:tasks.workunit.client.0.vm04.stdout:start osd.7 2026-03-08T22:43:42.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 7 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/7 --osd-journal=td/test-erasure-code/7/journal --chdir= --run-dir=td/test-erasure-code 
'--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:43:42.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:43:42.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:43:42.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:43:42.844 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:42.843+0000 7fae3f33f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:42.845 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:42.846+0000 7fae3f33f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:42.847 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:42.847+0000 7fae3f33f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:43.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 7 2026-03-08T22:43:43.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:43:43.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=7 2026-03-08T22:43:43.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:43:43.050 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:43:43.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:43:43.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:43.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:43:43.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:43.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T22:43:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:43.403 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:43.404+0000 7fae3f33f780 -1 Falling back to public interface 2026-03-08T22:43:44.281 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:43:44.281 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:44.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:44.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:43:44.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:44.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T22:43:44.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:44.517 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:44.518+0000 7fae3f33f780 -1 osd.7 0 log_to_monitors true 2026-03-08T22:43:45.507 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:43:45.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:45.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:45.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:43:45.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:45.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T22:43:45.738 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:46.740 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:43:46.740 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:46.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:46.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:43:46.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:46.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T22:43:46.959 INFO:tasks.workunit.client.0.vm04.stdout:osd.7 up in weight 1 up_from 40 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6858/785090946,v1:127.0.0.1:6859/785090946] [v2:127.0.0.1:6860/785090946,v1:127.0.0.1:6861/785090946] exists,up b9a3485e-5913-489e-9b1a-989f1372609f 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: for id in $(seq 0 10) 
2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:37: run: run_osd td/test-erasure-code 8 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=8 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code/8 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: 
run_osd: ceph_args+=' --osd-data=td/test-erasure-code/8' 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code/8/journal' 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:43:46.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code' 2026-03-08T22:43:46.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:43:46.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:46.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:43:46.961 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:43:46.961 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:46.961 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:43:46.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 
2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:43:46.962 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:43:46.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code/8 2026-03-08T22:43:46.964 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:43:46.965 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=5bd05371-2ba1-4405-989c-3e746c7134d2 2026-03-08T22:43:46.965 INFO:tasks.workunit.client.0.vm04.stdout:add osd8 5bd05371-2ba1-4405-989c-3e746c7134d2 2026-03-08T22:43:46.965 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd8 5bd05371-2ba1-4405-989c-3e746c7134d2' 2026-03-08T22:43:46.965 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:43:46.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCi+61pvW9rOhAAKzAqFxlNrcQr1fmz+AkhHw== 2026-03-08T22:43:46.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCi+61pvW9rOhAAKzAqFxlNrcQr1fmz+AkhHw=="}' 2026-03-08T22:43:46.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 5bd05371-2ba1-4405-989c-3e746c7134d2 -i td/test-erasure-code/8/new.json 2026-03-08T22:43:47.215 
INFO:tasks.workunit.client.0.vm04.stdout:8 2026-03-08T22:43:47.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code/8/new.json 2026-03-08T22:43:47.226 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 8 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/8 --osd-journal=td/test-erasure-code/8/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCi+61pvW9rOhAAKzAqFxlNrcQr1fmz+AkhHw== --osd-uuid 5bd05371-2ba1-4405-989c-3e746c7134d2 2026-03-08T22:43:47.247 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:47.248+0000 7f3bdfd19780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:47.250 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:47.251+0000 7f3bdfd19780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:47.251 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:47.252+0000 7f3bdfd19780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:47.252 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:47.253+0000 7f3bdfd19780 -1 bdev(0x55c2eddcfc00 td/test-erasure-code/8/block) open stat got: (1) Operation not permitted 2026-03-08T22:43:47.252 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:47.253+0000 7f3bdfd19780 -1 bluestore(td/test-erasure-code/8) _read_fsid unparsable uuid 2026-03-08T22:43:49.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code/8/keyring 2026-03-08T22:43:49.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:43:49.396 INFO:tasks.workunit.client.0.vm04.stdout:adding osd8 key to auth repository 2026-03-08T22:43:49.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd8 key to auth repository 2026-03-08T22:43:49.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code/8/keyring auth add osd.8 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:43:49.685 INFO:tasks.workunit.client.0.vm04.stdout:start osd.8 2026-03-08T22:43:49.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.8 2026-03-08T22:43:49.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 8 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/8 --osd-journal=td/test-erasure-code/8/journal --chdir= --run-dir=td/test-erasure-code 
'--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:43:49.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:43:49.686 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:43:49.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:43:49.704 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:49.705+0000 7f54946e2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:49.712 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:49.713+0000 7f54946e2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:49.713 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:49.715+0000 7f54946e2780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:49.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 8 2026-03-08T22:43:49.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:43:49.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=8 2026-03-08T22:43:49.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:43:49.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:43:49.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:49.916 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:43:49.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:43:49.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:49.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T22:43:50.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:50.805 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:50.806+0000 7f54946e2780 -1 Falling back to public interface 2026-03-08T22:43:51.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:43:51.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:51.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:43:51.141 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:43:51.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:51.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T22:43:51.388 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:51.930 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:51.932+0000 7f54946e2780 -1 osd.8 0 log_to_monitors true 2026-03-08T22:43:52.389 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:52.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:52.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:43:52.390 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:43:52.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:52.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T22:43:52.659 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:53.631 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:53.632+0000 7f548fe81640 -1 osd.8 0 waiting for initial osdmap 2026-03-08T22:43:53.661 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:43:53.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:53.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:53.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:43:53.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:53.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T22:43:53.888 INFO:tasks.workunit.client.0.vm04.stdout:osd.8 up in weight 1 up_from 45 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6866/2372467522,v1:127.0.0.1:6867/2372467522] [v2:127.0.0.1:6868/2372467522,v1:127.0.0.1:6869/2372467522] exists,up 5bd05371-2ba1-4405-989c-3e746c7134d2 2026-03-08T22:43:53.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:43:53.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:43:53.889 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: for id in $(seq 0 10) 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:37: run: run_osd td/test-erasure-code 9 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=9 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code/9 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000' 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code/9' 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code/9/journal' 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:43:53.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code' 2026-03-08T22:43:53.890 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:43:53.890 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:53.890 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:43:53.890 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:43:53.890 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:53.890 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:43:53.890 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:53.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:53.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:43:53.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:43:53.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:43:53.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:43:53.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:43:53.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:43:53.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:43:53.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:43:53.891 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:43:53.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:43:53.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:43:53.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code/9 2026-03-08T22:43:53.892 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:43:53.892 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=e624fc0a-0f7d-4c72-92d5-27028e3149b6 2026-03-08T22:43:53.892 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd9 e624fc0a-0f7d-4c72-92d5-27028e3149b6' 2026-03-08T22:43:53.893 INFO:tasks.workunit.client.0.vm04.stdout:add osd9 e624fc0a-0f7d-4c72-92d5-27028e3149b6 2026-03-08T22:43:53.893 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:43:53.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCp+61pA4skNhAA+Bq6K7W9E/NDvjmCwsc4ig== 2026-03-08T22:43:53.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCp+61pA4skNhAA+Bq6K7W9E/NDvjmCwsc4ig=="}' 2026-03-08T22:43:53.907 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e624fc0a-0f7d-4c72-92d5-27028e3149b6 -i td/test-erasure-code/9/new.json 2026-03-08T22:43:54.143 INFO:tasks.workunit.client.0.vm04.stdout:9 2026-03-08T22:43:54.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code/9/new.json 2026-03-08T22:43:54.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 9 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/9 --osd-journal=td/test-erasure-code/9/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCp+61pA4skNhAA+Bq6K7W9E/NDvjmCwsc4ig== --osd-uuid e624fc0a-0f7d-4c72-92d5-27028e3149b6 2026-03-08T22:43:54.179 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:54.181+0000 7f1451e1c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:54.181 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:54.183+0000 7f1451e1c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:54.184 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:54.185+0000 7f1451e1c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:54.185 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:54.186+0000 7f1451e1c780 -1 bdev(0x55fa53babc00 td/test-erasure-code/9/block) open stat got: (1) Operation not permitted 2026-03-08T22:43:54.185 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:54.186+0000 7f1451e1c780 -1 bluestore(td/test-erasure-code/9) _read_fsid unparsable uuid 2026-03-08T22:43:56.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code/9/keyring 2026-03-08T22:43:56.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:43:56.331 INFO:tasks.workunit.client.0.vm04.stdout:adding osd9 key to auth repository 2026-03-08T22:43:56.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd9 key to auth repository 2026-03-08T22:43:56.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code/9/keyring auth add osd.9 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:43:56.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.9 2026-03-08T22:43:56.635 INFO:tasks.workunit.client.0.vm04.stdout:start osd.9 2026-03-08T22:43:56.636 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 9 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/9 --osd-journal=td/test-erasure-code/9/journal --chdir= --run-dir=td/test-erasure-code 
'--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:43:56.636 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:43:56.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:43:56.639 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:43:56.660 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:56.661+0000 7fc2a1412780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:56.661 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:56.663+0000 7fc2a1412780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:56.663 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:56.664+0000 7fc2a1412780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:43:56.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 9 2026-03-08T22:43:56.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:43:56.875 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:43:56.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9 2026-03-08T22:43:56.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:43:56.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:43:56.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:56.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:43:56.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:56.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T22:43:57.108 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:57.471 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:57.472+0000 7fc2a1412780 -1 Falling back to public interface 2026-03-08T22:43:58.111 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:43:58.111 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:58.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:58.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:43:58.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:58.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T22:43:58.325 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:58.327+0000 7fc2a1412780 -1 osd.9 0 log_to_monitors true 2026-03-08T22:43:58.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:43:59.348 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:43:59.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:43:59.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:43:59.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:43:59.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T22:43:59.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:43:59.413 
INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:59.414+0000 7fc29c55b640 -1 osd.9 0 waiting for initial osdmap 2026-03-08T22:43:59.584 INFO:tasks.workunit.client.0.vm04.stdout:osd.9 up in weight 1 up_from 50 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6874/2114803806,v1:127.0.0.1:6875/2114803806] [v2:127.0.0.1:6876/2114803806,v1:127.0.0.1:6877/2114803806] exists,up e624fc0a-0f7d-4c72-92d5-27028e3149b6 2026-03-08T22:43:59.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:43:59.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:43:59.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:43:59.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:36: run: for id in $(seq 0 10) 2026-03-08T22:43:59.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:37: run: run_osd td/test-erasure-code 10 2026-03-08T22:43:59.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-code 2026-03-08T22:43:59.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=10 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:43:59.585 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-code/10 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-code/10' 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-code/10/journal' 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-code' 2026-03-08T22:43:59.585 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:43:59.585 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:43:59.586 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:59.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:43:59.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:43:59.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:43:59.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:43:59.587 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:43:59.587 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:43:59.587 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:43:59.587 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:43:59.587 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:43:59.587 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:43:59.587 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:43:59.587 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:43:59.587 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-code/10 2026-03-08T22:43:59.588 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:43:59.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local 
uuid=2a67c6f7-b47a-45e8-8ed0-ca1d1f3e2f54 2026-03-08T22:43:59.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd10 2a67c6f7-b47a-45e8-8ed0-ca1d1f3e2f54' 2026-03-08T22:43:59.589 INFO:tasks.workunit.client.0.vm04.stdout:add osd10 2a67c6f7-b47a-45e8-8ed0-ca1d1f3e2f54 2026-03-08T22:43:59.589 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:43:59.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCv+61pjZ4IJBAAF6ddrTAfGjpE0fPpyiZz3A== 2026-03-08T22:43:59.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCv+61pjZ4IJBAAF6ddrTAfGjpE0fPpyiZz3A=="}' 2026-03-08T22:43:59.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 2a67c6f7-b47a-45e8-8ed0-ca1d1f3e2f54 -i td/test-erasure-code/10/new.json 2026-03-08T22:43:59.867 INFO:tasks.workunit.client.0.vm04.stdout:10 2026-03-08T22:43:59.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-code/10/new.json 2026-03-08T22:43:59.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 10 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/10 --osd-journal=td/test-erasure-code/10/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 
--debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCv+61pjZ4IJBAAF6ddrTAfGjpE0fPpyiZz3A== --osd-uuid 2a67c6f7-b47a-45e8-8ed0-ca1d1f3e2f54 2026-03-08T22:43:59.897 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:59.899+0000 7faa3a2d9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:59.900 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:59.901+0000 7faa3a2d9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:59.901 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:59.903+0000 7faa3a2d9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:43:59.902 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:59.903+0000 7faa3a2d9780 -1 bdev(0x5612313dfc00 td/test-erasure-code/10/block) open stat got: (1) Operation not permitted 2026-03-08T22:43:59.902 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:43:59.903+0000 7faa3a2d9780 -1 bluestore(td/test-erasure-code/10) _read_fsid unparsable uuid 2026-03-08T22:44:02.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-code/10/keyring 2026-03-08T22:44:02.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:44:02.037 INFO:tasks.workunit.client.0.vm04.stdout:adding osd10 key to auth repository 2026-03-08T22:44:02.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd10 key to auth repository 2026-03-08T22:44:02.037 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-code/10/keyring auth add osd.10 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:44:02.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.10 2026-03-08T22:44:02.362 INFO:tasks.workunit.client.0.vm04.stdout:start osd.10 2026-03-08T22:44:02.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 10 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/10 --osd-journal=td/test-erasure-code/10/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:44:02.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:44:02.363 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:44:02.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:44:02.381 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:02.381+0000 7f7feee0d780 -1 WARNING: all dangerous and experimental features are 
enabled. 2026-03-08T22:44:02.384 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:02.386+0000 7f7feee0d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:02.387 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:02.388+0000 7f7feee0d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:02.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 10 2026-03-08T22:44:02.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:44:02.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=10 2026-03-08T22:44:02.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:44:02.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:44:02.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:02.597 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:44:02.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:44:02.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:02.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.10 up' 2026-03-08T22:44:02.816 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:02.943 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:02.944+0000 7f7feee0d780 -1 Falling back to public interface 2026-03-08T22:44:03.818 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:44:03.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:44:03.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:03.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:44:03.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:03.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.10 up' 2026-03-08T22:44:04.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:04.304 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:04.306+0000 7f7feee0d780 -1 osd.10 0 log_to_monitors true 2026-03-08T22:44:05.055 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:44:05.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:44:05.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:05.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:44:05.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:05.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.10 up' 2026-03-08T22:44:05.346 INFO:tasks.workunit.client.0.vm04.stdout:osd.10 up in weight 1 up_from 55 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6882/147880667,v1:127.0.0.1:6883/147880667] [v2:127.0.0.1:6884/147880667,v1:127.0.0.1:6885/147880667] exists,up 2a67c6f7-b47a-45e8-8ed0-ca1d1f3e2f54 2026-03-08T22:44:05.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:44:05.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:44:05.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:44:05.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:39: run: create_rbd_pool 2026-03-08T22:44:05.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T22:44:05.586 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' does not exist 2026-03-08T22:44:05.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T22:44:05.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:44:05.847 
INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:44:05.858 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:44:06.860 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T22:44:07.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:40: run: wait_for_clean 2026-03-08T22:44:07.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:44:07.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:44:07.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:44:07.159 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:44:07.160 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:44:07.160 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:44:07.160 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:44:07.160 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:44:07.160 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:44:07.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:44:07.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:44:07.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:44:07.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:44:07.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:44:07.240 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:8 2026-03-08T22:44:07.475 
INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:10' 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:07.475 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:44:07.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836497 2026-03-08T22:44:07.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836497 2026-03-08T22:44:07.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497' 2026-03-08T22:44:07.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:07.559 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:44:07.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672975 2026-03-08T22:44:07.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672975 2026-03-08T22:44:07.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-21474836497 1-42949672975' 2026-03-08T22:44:07.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:07.638 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:44:07.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509454 2026-03-08T22:44:07.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509454 2026-03-08T22:44:07.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672975 2-64424509454' 2026-03-08T22:44:07.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:07.714 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:44:07.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345932 2026-03-08T22:44:07.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345932 2026-03-08T22:44:07.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672975 2-64424509454 3-85899345932' 2026-03-08T22:44:07.791 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:07.791 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:44:07.866 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182411 2026-03-08T22:44:07.866 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182411 2026-03-08T22:44:07.866 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672975 2-64424509454 3-85899345932 4-107374182411' 2026-03-08T22:44:07.866 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:07.867 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:44:07.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018889 2026-03-08T22:44:07.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018889 2026-03-08T22:44:07.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672975 2-64424509454 3-85899345932 4-107374182411 5-128849018889' 2026-03-08T22:44:07.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: 
flush_pg_stats: for osd in $ids 2026-03-08T22:44:07.952 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:44:08.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855368 2026-03-08T22:44:08.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855368 2026-03-08T22:44:08.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672975 2-64424509454 3-85899345932 4-107374182411 5-128849018889 6-150323855368' 2026-03-08T22:44:08.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:08.029 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T22:44:08.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691847 2026-03-08T22:44:08.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691847 2026-03-08T22:44:08.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672975 2-64424509454 3-85899345932 4-107374182411 5-128849018889 6-150323855368 7-171798691847' 2026-03-08T22:44:08.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T22:44:08.101 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T22:44:08.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528325 2026-03-08T22:44:08.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528325 2026-03-08T22:44:08.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672975 2-64424509454 3-85899345932 4-107374182411 5-128849018889 6-150323855368 7-171798691847 8-193273528325' 2026-03-08T22:44:08.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:08.174 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T22:44:08.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364804 2026-03-08T22:44:08.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364804 2026-03-08T22:44:08.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672975 2-64424509454 3-85899345932 4-107374182411 5-128849018889 6-150323855368 7-171798691847 8-193273528325 9-214748364804' 2026-03-08T22:44:08.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in 
$ids 2026-03-08T22:44:08.247 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.10 flush_pg_stats 2026-03-08T22:44:08.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=236223201283 2026-03-08T22:44:08.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 236223201283 2026-03-08T22:44:08.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672975 2-64424509454 3-85899345932 4-107374182411 5-128849018889 6-150323855368 7-171798691847 8-193273528325 9-214748364804 10-236223201283' 2026-03-08T22:44:08.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:08.331 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836497 2026-03-08T22:44:08.331 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:08.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:44:08.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836497 2026-03-08T22:44:08.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:08.333 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836497 2026-03-08T22:44:08.333 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836497' 2026-03-08T22:44:08.333 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 21474836497 2026-03-08T22:44:08.333 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:44:08.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836497 -lt 21474836497 2026-03-08T22:44:08.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:08.556 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672975 2026-03-08T22:44:08.556 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:08.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:44:08.558 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672975 2026-03-08T22:44:08.558 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:08.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672975 
2026-03-08T22:44:08.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672975' 2026-03-08T22:44:08.559 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 42949672975 2026-03-08T22:44:08.559 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:44:08.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672975 -lt 42949672975 2026-03-08T22:44:08.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:08.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509454 2026-03-08T22:44:08.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:08.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:44:08.783 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509454 2026-03-08T22:44:08.783 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:08.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509454 2026-03-08T22:44:08.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 
'waiting osd.2 seq 64424509454' 2026-03-08T22:44:08.784 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 64424509454 2026-03-08T22:44:08.784 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:44:09.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509454 -lt 64424509454 2026-03-08T22:44:09.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:09.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345932 2026-03-08T22:44:09.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:09.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:44:09.012 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345932 2026-03-08T22:44:09.012 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:09.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345932 2026-03-08T22:44:09.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345932' 2026-03-08T22:44:09.014 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 85899345932 2026-03-08T22:44:09.014 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:44:09.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345932 -lt 85899345932 2026-03-08T22:44:09.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:09.238 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182411 2026-03-08T22:44:09.238 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:09.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:44:09.239 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182411 2026-03-08T22:44:09.239 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:09.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182411 2026-03-08T22:44:09.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182411' 2026-03-08T22:44:09.240 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 107374182411 2026-03-08T22:44:09.241 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 
2026-03-08T22:44:09.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182411 -lt 107374182411 2026-03-08T22:44:09.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:09.462 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-128849018889 2026-03-08T22:44:09.462 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:09.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:44:09.464 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-128849018889 2026-03-08T22:44:09.464 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:09.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018889 2026-03-08T22:44:09.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 128849018889' 2026-03-08T22:44:09.465 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 128849018889 2026-03-08T22:44:09.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:44:09.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
test 128849018887 -lt 128849018889 2026-03-08T22:44:09.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:44:10.694 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:44:10.694 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:44:10.921 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018890 -lt 128849018889 2026-03-08T22:44:10.921 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:10.922 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855368 2026-03-08T22:44:10.922 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:10.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:44:10.923 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855368 2026-03-08T22:44:10.923 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:10.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855368 2026-03-08T22:44:10.924 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855368' 2026-03-08T22:44:10.924 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 150323855368 2026-03-08T22:44:10.925 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:44:11.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855368 -lt 150323855368 2026-03-08T22:44:11.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:11.174 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691847 2026-03-08T22:44:11.174 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:11.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T22:44:11.176 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691847 2026-03-08T22:44:11.176 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:11.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691847 2026-03-08T22:44:11.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 
171798691847' 2026-03-08T22:44:11.177 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.7 seq 171798691847 2026-03-08T22:44:11.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T22:44:11.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691847 -lt 171798691847 2026-03-08T22:44:11.423 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:11.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528325 2026-03-08T22:44:11.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:11.424 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T22:44:11.425 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528325 2026-03-08T22:44:11.425 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:11.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528325 2026-03-08T22:44:11.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528325' 2026-03-08T22:44:11.427 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.8 seq 193273528325 2026-03-08T22:44:11.427 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8
2026-03-08T22:44:11.664 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528326 -lt 193273528325
2026-03-08T22:44:11.664 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:11.664 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-214748364804
2026-03-08T22:44:11.664 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:11.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9
2026-03-08T22:44:11.666 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-214748364804
2026-03-08T22:44:11.666 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:11.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364804
2026-03-08T22:44:11.667 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.9 seq 214748364804
2026-03-08T22:44:11.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 214748364804'
2026-03-08T22:44:11.668 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9
2026-03-08T22:44:11.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364805 -lt 214748364804
2026-03-08T22:44:11.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:11.917 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 10-236223201283
2026-03-08T22:44:11.918 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:11.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=10
2026-03-08T22:44:11.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 10-236223201283
2026-03-08T22:44:11.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:11.920 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=236223201283
2026-03-08T22:44:11.920 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.10 seq 236223201283
2026-03-08T22:44:11.921 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.10 seq 236223201283'
2026-03-08T22:44:11.921 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 10
2026-03-08T22:44:12.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 236223201284 -lt 236223201283
2026-03-08T22:44:12.168 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:44:12.168 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:44:12.168 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:44:12.480 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0
2026-03-08T22:44:12.480 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:44:12.480 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:44:12.480 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:44:12.480 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:44:12.480 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:44:12.480 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:44:12.481 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:44:12.702 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4
2026-03-08T22:44:12.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:44:12.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:44:12.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:44:13.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4
2026-03-08T22:44:13.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:44:13.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:44:13.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:42: run: get_asok_path osd.0
2026-03-08T22:44:13.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:44:13.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:44:13.014 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:44:13.014 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:44:13.014 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377
2026-03-08T22:44:13.014 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.74377/ceph-osd.0.asok
2026-03-08T22:44:13.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:42: run: CEPH_ARGS=
2026-03-08T22:44:13.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:42: run: ceph --admin-daemon /tmp/ceph-asok.74377/ceph-osd.0.asok log flush
2026-03-08T22:44:13.075 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:44:13.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:43: run: grep 'load: jerasure.*lrc' td/test-erasure-code/osd.0.log
2026-03-08T22:44:13.084 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:42:52.490+0000 7fe974790780 0 load: jerasure load: lrc
2026-03-08T22:44:13.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:44: run: create_erasure_coded_pool ecpool
2026-03-08T22:44:13.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:56: create_erasure_coded_pool: local poolname=ecpool
2026-03-08T22:44:13.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:58: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd
2026-03-08T22:44:13.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:60: create_erasure_coded_pool: create_pool ecpool 12 12 erasure myprofile
2026-03-08T22:44:13.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 12 12 erasure myprofile
2026-03-08T22:44:13.821 INFO:tasks.workunit.client.0.vm04.stderr:pool 'ecpool' already exists
2026-03-08T22:44:13.833 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T22:44:14.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:62: create_erasure_coded_pool: wait_for_clean
2026-03-08T22:44:14.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:44:14.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:44:14.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:44:14.834 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:44:14.834 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
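[Editor's note] The `get_timeout_delays 90 .1` call entered above produces the polling schedule `wait_for_clean` uses (it appears a few records later as `delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')`). A minimal sketch of the pattern implied by that output, assuming the series doubles from the base, is capped at 15 s per step, and truncates the final step so the total equals the timeout budget; `timeout_delays` and the `cap` parameter are hypothetical names, not the helper's real signature:

```python
def timeout_delays(timeout, base, cap=15):
    """Capped exponential backoff series summing exactly to `timeout`."""
    delays, d, total = [], base, 0.0
    while total < timeout:
        step = min(d, cap, timeout - total)  # cap the step, then truncate the tail
        delays.append(round(step, 1))
        total += step
        d *= 2  # double the candidate delay each iteration
    return delays

print(timeout_delays(90, 0.1))
# reproduces the array logged by wait_for_clean for a 90 s budget and 0.1 s base
```

The caller then sleeps `delays[loop]` between retries, so early failures are rechecked quickly while a slow cluster is polled every 15 s until the budget runs out.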
2026-03-08T22:44:14.835 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:44:14.835 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:44:14.835 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:44:14.835 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:44:14.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:44:14.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:44:14.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:44:14.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:44:14.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:44:14.910 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:4
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:5
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:6
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:7
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:8
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:9
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:10'
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:15.139 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:44:15.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836500
2026-03-08T22:44:15.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836500
2026-03-08T22:44:15.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500'
2026-03-08T22:44:15.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:15.220 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:44:15.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672979
2026-03-08T22:44:15.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672979
2026-03-08T22:44:15.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672979'
2026-03-08T22:44:15.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:15.306 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:44:15.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509458
2026-03-08T22:44:15.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509458
2026-03-08T22:44:15.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672979 2-64424509458'
2026-03-08T22:44:15.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:15.385 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:44:15.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345936
2026-03-08T22:44:15.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345936
2026-03-08T22:44:15.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672979 2-64424509458 3-85899345936'
2026-03-08T22:44:15.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:15.463 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T22:44:15.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182414
2026-03-08T22:44:15.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182414
2026-03-08T22:44:15.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672979 2-64424509458 3-85899345936 4-107374182414'
2026-03-08T22:44:15.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:15.542 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats
2026-03-08T22:44:15.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018893
2026-03-08T22:44:15.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018893
2026-03-08T22:44:15.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672979 2-64424509458 3-85899345936 4-107374182414 5-128849018893'
2026-03-08T22:44:15.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:15.624 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats
2026-03-08T22:44:15.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855372
2026-03-08T22:44:15.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855372
2026-03-08T22:44:15.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672979 2-64424509458 3-85899345936 4-107374182414 5-128849018893 6-150323855372'
2026-03-08T22:44:15.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:15.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats
2026-03-08T22:44:15.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691850
2026-03-08T22:44:15.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691850
2026-03-08T22:44:15.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672979 2-64424509458 3-85899345936 4-107374182414 5-128849018893 6-150323855372 7-171798691850'
2026-03-08T22:44:15.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:15.779 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats
2026-03-08T22:44:15.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528329
2026-03-08T22:44:15.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528329
2026-03-08T22:44:15.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672979 2-64424509458 3-85899345936 4-107374182414 5-128849018893 6-150323855372 7-171798691850 8-193273528329'
2026-03-08T22:44:15.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:15.853 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats
2026-03-08T22:44:15.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364808
2026-03-08T22:44:15.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364808
2026-03-08T22:44:15.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672979 2-64424509458 3-85899345936 4-107374182414 5-128849018893 6-150323855372 7-171798691850 8-193273528329 9-214748364808'
2026-03-08T22:44:15.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:15.933 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.10 flush_pg_stats
2026-03-08T22:44:16.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=236223201287
2026-03-08T22:44:16.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 236223201287
2026-03-08T22:44:16.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672979 2-64424509458 3-85899345936 4-107374182414 5-128849018893 6-150323855372 7-171798691850 8-193273528329 9-214748364808 10-236223201287'
2026-03-08T22:44:16.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:16.006 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836500
2026-03-08T22:44:16.006 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
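[Editor's note] The trace above shows `flush_pg_stats` building a list of `<osd>-<seq>` pairs (one per `ceph tell osd.N flush_pg_stats` reply), and the records that follow show it splitting each pair with `cut -d -` and polling `ceph osd last-stat-seq N` until the monitor has caught up to the flushed sequence. A stubbed, runnable sketch of that control flow, with the two `ceph` invocations replaced by toy callables (the real helper is bash in qa/standalone/ceph-helpers.sh):

```python
import time

def flush_pg_stats(tell_flush, last_stat_seq, osd_ids, poll=0.01):
    """Mirror of the loop in the trace: collect seqs, then wait per OSD."""
    # collect "<osd>-<seq>" pairs, like seqs=' 0-21474836500 1-...' in the log
    pairs = [f"{osd}-{tell_flush(osd)}" for osd in osd_ids]
    for pair in pairs:
        osd, seq = pair.split("-", 1)        # cut -d - -f 1 / cut -d - -f 2
        seq = int(seq)
        print(f"waiting osd.{osd} seq {seq}")
        while last_stat_seq(int(osd)) < seq:  # the `test ... -lt ...` check
            time.sleep(poll)

# toy stand-ins: "flushing" bumps a per-OSD counter that the mon then reports
state = {}
def tell_flush(osd):
    state[osd] = state.get(osd, 0) + 1
    return state[osd]
def last_stat_seq(osd):
    return state.get(osd, 0)

flush_pg_stats(tell_flush, last_stat_seq, range(3))
```

In the real run the loop usually exits on its first probe, as seen above where `last-stat-seq` already returns a value one past the flushed seq (e.g. `test 21474836501 -lt 21474836500`).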
2026-03-08T22:44:16.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:44:16.008 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836500
2026-03-08T22:44:16.008 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:16.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836500
2026-03-08T22:44:16.009 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 21474836500
2026-03-08T22:44:16.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836500'
2026-03-08T22:44:16.009 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:44:16.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836501 -lt 21474836500
2026-03-08T22:44:16.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:16.229 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672979
2026-03-08T22:44:16.229 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:16.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:44:16.230 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672979
2026-03-08T22:44:16.231 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:16.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672979
2026-03-08T22:44:16.232 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 42949672979
2026-03-08T22:44:16.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672979'
2026-03-08T22:44:16.232 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:44:16.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672979 -lt 42949672979
2026-03-08T22:44:16.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:16.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509458
2026-03-08T22:44:16.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:16.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:44:16.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509458
2026-03-08T22:44:16.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:16.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509458
2026-03-08T22:44:16.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509458'
2026-03-08T22:44:16.461 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 64424509458
2026-03-08T22:44:16.461 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:44:16.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509458 -lt 64424509458
2026-03-08T22:44:16.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:16.681 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345936
2026-03-08T22:44:16.682 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:16.682 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:44:16.683 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345936
2026-03-08T22:44:16.683 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:16.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345936
2026-03-08T22:44:16.684 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 85899345936
2026-03-08T22:44:16.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345936'
2026-03-08T22:44:16.684 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:44:16.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345936 -lt 85899345936
2026-03-08T22:44:16.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:16.904 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182414
2026-03-08T22:44:16.904 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:16.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T22:44:16.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182414
2026-03-08T22:44:16.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182414
2026-03-08T22:44:16.907 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 107374182414
2026-03-08T22:44:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182414'
2026-03-08T22:44:16.907 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T22:44:17.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182414 -lt 107374182414
2026-03-08T22:44:17.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:17.143 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-128849018893
2026-03-08T22:44:17.143 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:17.144 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5
2026-03-08T22:44:17.145 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-128849018893
2026-03-08T22:44:17.145 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:17.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018893
2026-03-08T22:44:17.146 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 128849018893
2026-03-08T22:44:17.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 128849018893'
2026-03-08T22:44:17.146 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5
2026-03-08T22:44:17.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018893 -lt 128849018893
2026-03-08T22:44:17.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:17.379 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855372
2026-03-08T22:44:17.379 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:17.380 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6
2026-03-08T22:44:17.381 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855372
2026-03-08T22:44:17.381 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:17.382 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855372
2026-03-08T22:44:17.382 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 150323855372
2026-03-08T22:44:17.382 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855372'
2026-03-08T22:44:17.382 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6
2026-03-08T22:44:17.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855372 -lt 150323855372
2026-03-08T22:44:17.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:17.601 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691850
2026-03-08T22:44:17.601 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:17.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7
2026-03-08T22:44:17.603 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691850
2026-03-08T22:44:17.603 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:17.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691850
2026-03-08T22:44:17.604 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.7 seq 171798691850
2026-03-08T22:44:17.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 
171798691850' 2026-03-08T22:44:17.604 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T22:44:17.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691850 -lt 171798691850 2026-03-08T22:44:17.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:17.831 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528329 2026-03-08T22:44:17.831 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:17.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T22:44:17.833 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528329 2026-03-08T22:44:17.833 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:17.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528329 2026-03-08T22:44:17.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528329' 2026-03-08T22:44:17.834 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.8 seq 193273528329 2026-03-08T22:44:17.835 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T22:44:18.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528329 -lt 193273528329 2026-03-08T22:44:18.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:18.064 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-214748364808 2026-03-08T22:44:18.064 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:18.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T22:44:18.066 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-214748364808 2026-03-08T22:44:18.066 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:18.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364808 2026-03-08T22:44:18.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 214748364808' 2026-03-08T22:44:18.068 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.9 seq 214748364808 2026-03-08T22:44:18.068 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T22:44:18.290 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364808 -lt 214748364808 2026-03-08T22:44:18.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:18.291 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 10-236223201287 2026-03-08T22:44:18.291 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:18.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=10 2026-03-08T22:44:18.292 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 10-236223201287 2026-03-08T22:44:18.292 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:18.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=236223201287 2026-03-08T22:44:18.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.10 seq 236223201287' 2026-03-08T22:44:18.293 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.10 seq 236223201287 2026-03-08T22:44:18.293 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 10 2026-03-08T22:44:18.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 236223201287 
-lt 236223201287 2026-03-08T22:44:18.525 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:44:18.525 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:18.525 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:18.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 16 == 0 2026-03-08T22:44:18.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:18.854 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:18.854 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:18.854 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:18.854 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:18.854 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:18.854 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: 
jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:19.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=16 2026-03-08T22:44:19.101 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:19.101 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:19.101 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:19.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 16 = 16 2026-03-08T22:44:19.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:44:19.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:44:19.413 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:46: run: set 2026-03-08T22:44:19.413 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:46: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p' 2026-03-08T22:44:19.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:46: run: local 'funcs=TEST_alignment_constraints 2026-03-08T22:44:19.415 INFO:tasks.workunit.client.0.vm04.stderr:TEST_chunk_mapping 2026-03-08T22:44:19.415 
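The flush_pg_stats loop traced above (qa/standalone/ceph-helpers.sh:2273-2277) walks a list of `<osd>-<seq>` pairs, splits each with `cut`, and polls `ceph osd last-stat-seq <osd>` until the reported seq catches up to the target. A minimal re-sketch of that parsing and wait logic, where `last_stat_seq` is a hypothetical stand-in that returns the target immediately so the sketch terminates without a cluster:

```shell
# Stand-in for: ceph osd last-stat-seq "$1" -- returns the target at once
# so the wait loop below exits on the first test, as it does in the log.
last_stat_seq() { echo "$target"; }

seqs="4-107374182414 5-128849018893"
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)      # field 1: OSD id
    target=$(echo "$s" | cut -d - -f 2)   # field 2: target stat seq
    echo "waiting osd.$osd seq $target"
    while test "$(last_stat_seq "$osd")" -lt "$target"; do sleep 1; done
done
```

In the real helper the loop only makes progress once each OSD has flushed its PG stats to the mgr, which is why the trace shows one `test X -lt X` (false, so the loop exits) per OSD.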
INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_put_get_isa 2026-03-08T22:44:19.415 INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_put_get_jerasure 2026-03-08T22:44:19.415 INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_put_get_lrc_advanced 2026-03-08T22:44:19.415 INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_put_get_lrc_kml 2026-03-08T22:44:19.415 INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_put_get_shec' 2026-03-08T22:44:19.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:47: run: for func in $funcs 2026-03-08T22:44:19.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:48: run: TEST_alignment_constraints td/test-erasure-code 2026-03-08T22:44:19.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:253: TEST_alignment_constraints: local payload=ABC 2026-03-08T22:44:19.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:254: TEST_alignment_constraints: echo ABC 2026-03-08T22:44:19.415 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:260: TEST_alignment_constraints: ceph-conf --show-config-value osd_pool_erasure_code_stripe_unit 2026-03-08T22:44:19.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:260: TEST_alignment_constraints: local stripe_unit=4096 2026-03-08T22:44:19.429 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:261: TEST_alignment_constraints: ceph osd erasure-code-profile get myprofile 2026-03-08T22:44:19.429 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:261: TEST_alignment_constraints: grep k= 2026-03-08T22:44:19.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:261: TEST_alignment_constraints: eval local k=2 2026-03-08T22:44:19.672 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:261: TEST_alignment_constraints: local k=2 2026-03-08T22:44:19.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:262: TEST_alignment_constraints: local block_size=8191 2026-03-08T22:44:19.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:263: TEST_alignment_constraints: dd if=/dev/zero of=td/test-erasure-code/ORIGINAL bs=8191 count=2 2026-03-08T22:44:19.673 INFO:tasks.workunit.client.0.vm04.stderr:2+0 records in 2026-03-08T22:44:19.673 INFO:tasks.workunit.client.0.vm04.stderr:2+0 records out 2026-03-08T22:44:19.673 INFO:tasks.workunit.client.0.vm04.stderr:16382 bytes (16 kB, 16 KiB) copied, 9.9646e-05 s, 164 MB/s 2026-03-08T22:44:19.673 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:264: TEST_alignment_constraints: rados --block-size=8191 --pool ecpool put UNALIGNED td/test-erasure-code/ORIGINAL 2026-03-08T22:44:19.696 INFO:tasks.workunit.client.0.vm04.stderr:INFO: op_size has been rounded to 8192 2026-03-08T22:44:19.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:266: TEST_alignment_constraints: rm td/test-erasure-code/ORIGINAL 2026-03-08T22:44:19.715 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:47: run: for func in $funcs 2026-03-08T22:44:19.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:48: run: TEST_chunk_mapping td/test-erasure-code 2026-03-08T22:44:19.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:304: TEST_chunk_mapping: local dir=td/test-erasure-code 2026-03-08T22:44:19.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:311: TEST_chunk_mapping: verify_chunk_mapping td/test-erasure-code ecpool 0 1 2026-03-08T22:44:19.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:281: verify_chunk_mapping: local dir=td/test-erasure-code 2026-03-08T22:44:19.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:282: verify_chunk_mapping: local poolname=ecpool 2026-03-08T22:44:19.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:283: verify_chunk_mapping: local first=0 2026-03-08T22:44:19.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:284: verify_chunk_mapping: local second=1 2026-03-08T22:44:19.716 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:286: verify_chunk_mapping: chunk_size 2026-03-08T22:44:19.716 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:270: chunk_size: ceph-conf --show-config-value 
osd_pool_erasure_code_stripe_unit 2026-03-08T22:44:19.730 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:270: chunk_size: echo 4096 2026-03-08T22:44:19.730 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:286: verify_chunk_mapping: printf '%*s' 4096 FIRSTecpool 2026-03-08T22:44:19.730 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:286: verify_chunk_mapping: chunk_size 2026-03-08T22:44:19.731 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:270: chunk_size: ceph-conf --show-config-value osd_pool_erasure_code_stripe_unit 2026-03-08T22:44:19.742 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:270: chunk_size: echo 4096 2026-03-08T22:44:19.742 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:286: verify_chunk_mapping: printf '%*s' 4096 SECONDecpool 2026-03-08T22:44:19.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:286: verify_chunk_mapping: local 'payload= FIRSTecpool SECONDecpool' 2026-03-08T22:44:19.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:287: verify_chunk_mapping: echo -n ' FIRSTecpool SECONDecpool' 2026-03-08T22:44:19.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:289: verify_chunk_mapping: rados --pool ecpool put SOMETHINGecpool td/test-erasure-code/ORIGINAL 2026-03-08T22:44:19.767 
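The `printf '%*s' 4096 FIRSTecpool` calls traced above are how verify_chunk_mapping builds its payload: each marker is left-padded with spaces to exactly one stripe unit (the 4096-byte `osd_pool_erasure_code_stripe_unit` read just before), so with k=2 the FIRST marker lands wholly in chunk 0 and the SECOND marker in chunk 1. A standalone sketch of that padding:

```shell
# Pad each marker to one chunk so the k=2 erasure code splits the object
# with FIRSTecpool entirely in the first chunk, SECONDecpool in the second.
chunk_size=4096
first=$(printf '%*s' "$chunk_size" FIRSTecpool)    # space-padded to width 4096
second=$(printf '%*s' "$chunk_size" SECONDecpool)
payload="$first$second"
echo "${#first} ${#payload}"   # 4096 8192
```

The later `objectstore_tool ... get-bytes | grep --quiet FIRSTecpool` steps then confirm which OSD's on-disk chunk holds which marker.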
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:290: verify_chunk_mapping: rados --pool ecpool get SOMETHINGecpool td/test-erasure-code/COPY 2026-03-08T22:44:19.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:291: verify_chunk_mapping: get_osds ecpool SOMETHINGecpool 2026-03-08T22:44:19.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T22:44:19.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHINGecpool 2026-03-08T22:44:19.791 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool SOMETHINGecpool 2026-03-08T22:44:19.791 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:44:20.030 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:44:20.031 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:44:20.031 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:44:20.031 INFO:tasks.workunit.client.0.vm04.stderr:0' 2026-03-08T22:44:20.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 5 9 0 2026-03-08T22:44:20.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:291: verify_chunk_mapping: osds=('3' '5' '9' '0') 2026-03-08T22:44:20.031 
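get_osds (ceph-helpers.sh:1021-1027), traced above, resolves an object's acting set by piping `ceph --format json osd map <pool> <obj>` through `jq '.acting | .[]'` and flattening the result with `echo`. A sketch against a canned JSON sample (hypothetical, shaped like the output implied by the trace, where the acting set is 3 5 9 0):

```shell
# Canned stand-in for: ceph --format json osd map ecpool SOMETHINGecpool
json='{"acting": [3, 5, 9, 0]}'
# jq emits one OSD id per line; unquoted echo word-splits them into "3 5 9 0"
osds=$(echo "$json" | jq '.acting | .[]')
echo $osds
```

With a k=2 m=2 profile the four acting OSDs each hold one chunk, which is why the script then iterates over all four with objectstore_tool.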
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:291: verify_chunk_mapping: local -a osds 2026-03-08T22:44:20.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i = 0 )) 2026-03-08T22:44:20.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i < 4 )) 2026-03-08T22:44:20.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:293: verify_chunk_mapping: ceph daemon osd.3 flush_journal 2026-03-08T22:44:20.099 INFO:tasks.workunit.client.0.vm04.stderr:Can't get admin socket path: unable to get conf option admin_socket for osd: b"error parsing 'osd': expected string of the form TYPE.ID, valid types are: auth, mon, osd, mds, mgr, client\n" 2026-03-08T22:44:20.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i++ )) 2026-03-08T22:44:20.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i < 4 )) 2026-03-08T22:44:20.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:293: verify_chunk_mapping: ceph daemon osd.5 flush_journal 2026-03-08T22:44:20.169 INFO:tasks.workunit.client.0.vm04.stderr:Can't get admin socket path: unable to get conf option admin_socket for osd: b"error parsing 'osd': expected string of the form TYPE.ID, valid types are: auth, mon, osd, mds, mgr, client\n" 2026-03-08T22:44:20.171 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i++ )) 2026-03-08T22:44:20.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i < 4 )) 2026-03-08T22:44:20.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:293: verify_chunk_mapping: ceph daemon osd.9 flush_journal 2026-03-08T22:44:20.243 INFO:tasks.workunit.client.0.vm04.stderr:Can't get admin socket path: unable to get conf option admin_socket for osd: b"error parsing 'osd': expected string of the form TYPE.ID, valid types are: auth, mon, osd, mds, mgr, client\n" 2026-03-08T22:44:20.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i++ )) 2026-03-08T22:44:20.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i < 4 )) 2026-03-08T22:44:20.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:293: verify_chunk_mapping: ceph daemon osd.0 flush_journal 2026-03-08T22:44:20.317 INFO:tasks.workunit.client.0.vm04.stderr:Can't get admin socket path: unable to get conf option admin_socket for osd: b"error parsing 'osd': expected string of the form TYPE.ID, valid types are: auth, mon, osd, mds, mgr, client\n" 2026-03-08T22:44:20.320 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i++ )) 2026-03-08T22:44:20.320 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: 
verify_chunk_mapping: (( i < 4 )) 2026-03-08T22:44:20.320 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:295: verify_chunk_mapping: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:44:20.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:296: verify_chunk_mapping: rm td/test-erasure-code/COPY 2026-03-08T22:44:20.322 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:298: verify_chunk_mapping: get_osds ecpool SOMETHINGecpool 2026-03-08T22:44:20.322 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T22:44:20.322 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHINGecpool 2026-03-08T22:44:20.323 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool SOMETHINGecpool 2026-03-08T22:44:20.323 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:44:20.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:44:20.555 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:44:20.555 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:0' 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 5 9 0 2026-03-08T22:44:20.556 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:298: verify_chunk_mapping: osds=('3' '5' '9' '0') 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:298: verify_chunk_mapping: local -a osds 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:299: verify_chunk_mapping: objectstore_tool td/test-erasure-code 3 SOMETHINGecpool get-bytes 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-code 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-code 3 SOMETHINGecpool get-bytes 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-code 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:44:20.556 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-code TERM osd.3 2026-03-08T22:44:20.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:299: verify_chunk_mapping: grep --quiet FIRSTecpool 2026-03-08T22:44:20.557 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:44:20.557 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:44:20.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:44:20.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:44:20.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:44:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:44:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-code 3 SOMETHINGecpool get-bytes 2026-03-08T22:44:20.863 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-code 2026-03-08T22:44:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:44:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T22:44:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:44:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-code/3 2026-03-08T22:44:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-code/3 SOMETHINGecpool get-bytes 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-code 3 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-code 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 
2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-code/3 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-code/3' 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-code/3/journal' 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' 
--run-dir=td/test-erasure-code' 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:44:21.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:44:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:44:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:44:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:44:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: 
activate_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:44:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:44:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:44:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:44:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:44:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:44:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:44:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-code/3 2026-03-08T22:44:21.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:44:21.449 INFO:tasks.workunit.client.0.vm04.stderr:start osd.3 2026-03-08T22:44:21.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 
--osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/3 --osd-journal=td/test-erasure-code/3/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:44:21.449 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-code/3/whoami 2026-03-08T22:44:21.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:44:21.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:44:21.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:44:21.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:44:21.470 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:21.470+0000 7f77b514d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:21.479 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:21.480+0000 7f77b514d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:21.481 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:21.481+0000 7f77b514d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:44:21.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:44:21.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:44:21.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:44:21.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:44:21.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:44:21.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:21.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:44:21.696 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:44:21.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:21.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:44:21.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:22.293 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:22.294+0000 7f77b514d780 -1 Falling back to public interface 2026-03-08T22:44:22.942 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:44:22.942 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:22.942 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:44:22.942 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:44:22.943 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:22.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:44:23.183 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:23.184+0000 7f77b514d780 -1 osd.3 68 log_to_monitors true 2026-03-08T22:44:23.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:24.206 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:44:24.206 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:24.206 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:44:24.206 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:44:24.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:24.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:44:24.472 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:25.474 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:44:25.474 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:25.474 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:44:25.474 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:44:25.474 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:25.474 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:44:25.707 INFO:tasks.workunit.client.0.vm04.stderr:osd.3 up in weight 1 up_from 72 up_thru 72 down_at 69 last_clean_interval [20,68) [v2:127.0.0.1:6826/1276030646,v1:127.0.0.1:6827/1276030646] [v2:127.0.0.1:6828/1276030646,v1:127.0.0.1:6829/1276030646] exists,up 7f037384-bb57-41b7-b6c3-50e879a92b0a 2026-03-08T22:44:25.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:44:25.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:44:25.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:44:25.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
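The `wait_for_osd` calls traced above poll `ceph osd dump` once per second until `osd.3 up` appears. A minimal sketch reconstructing that loop from the xtrace (not copied from ceph-helpers.sh; the body is inferred from the `(( i < 300 ))` / `grep 'osd.3 up'` / `sleep 1` steps visible in the log):

```shell
# Reconstructed from the xtrace: poll the osdmap until "osd.<id> <state>"
# shows up, retrying up to 300 times with a 1s sleep between attempts.
wait_for_osd() {
    local state=$1
    local id=$2
    local status=1
    for ((i = 0; i < 300; i++)); do
        if ceph osd dump | grep "osd.$id $state"; then
            status=0
            break
        fi
        sleep 1
    done
    return $status
}
```

In the run above the loop needed four iterations (i=0..3) before `osd.3 up` appeared in the osdmap, because the freshly restarted daemon had to report to the mon first.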
2026-03-08T22:44:25.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:44:25.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:44:25.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:44:25.708 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:44:25.708 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:44:25.708 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:44:25.708 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:44:25.708 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:44:25.708 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:44:25.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:44:25.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
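The `delays` array that `get_timeout_delays 90 .1` produces above follows a recognizable pattern: start at 0.1s, double each step, cap at 15s, and trim the final entry so the total equals the 90s timeout. A hedged re-implementation of that schedule (the real helper in ceph-helpers.sh may differ internally; this only reproduces the observable output):

```shell
# Backoff schedule: delays double from $first, are capped at $max, and the
# last delay is shortened so the sum is exactly $timeout. bash has no float
# arithmetic, so awk does the math.
get_timeout_delays() {
    local timeout=$1
    local first=${2:-1}
    local max=${3:-15}
    local -a delays=()
    local cur total=0
    cur=$(awk -v c="$first" 'BEGIN{print c + 0}')   # normalize ".1" -> "0.1"
    while awk -v t="$total" -v T="$timeout" 'BEGIN{exit !(t < T)}'; do
        local remaining
        remaining=$(awk -v T="$timeout" -v t="$total" 'BEGIN{print T - t}')
        # never overshoot the timeout: clamp the last delay
        if awk -v c="$cur" -v r="$remaining" 'BEGIN{exit !(c > r)}'; then
            cur=$remaining
        fi
        delays+=("$cur")
        total=$(awk -v t="$total" -v c="$cur" 'BEGIN{print t + c}')
        cur=$(awk -v c="$cur" -v m="$max" 'BEGIN{d = c * 2; print (d > m) ? m : d}')
    done
    echo "${delays[@]}"
}
```

Running it with the arguments from the trace yields the exact array logged at ceph-helpers.sh:1659: `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`, which sums to 90.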
2026-03-08T22:44:25.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:44:25.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:44:25.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:44:25.785 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:8 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:10' 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:26.022 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
ceph tell osd.0 flush_pg_stats 2026-03-08T22:44:26.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836505 2026-03-08T22:44:26.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836505 2026-03-08T22:44:26.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505' 2026-03-08T22:44:26.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:26.100 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:44:26.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672983 2026-03-08T22:44:26.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672983 2026-03-08T22:44:26.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983' 2026-03-08T22:44:26.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:26.182 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:44:26.256 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509462 2026-03-08T22:44:26.256 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509462 2026-03-08T22:44:26.256 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462' 2026-03-08T22:44:26.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:26.257 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:44:26.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=309237645315 2026-03-08T22:44:26.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 309237645315 2026-03-08T22:44:26.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-309237645315' 2026-03-08T22:44:26.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:26.325 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:44:26.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182419 2026-03-08T22:44:26.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182419 
2026-03-08T22:44:26.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-309237645315 4-107374182419' 2026-03-08T22:44:26.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:26.403 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:44:26.480 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018897 2026-03-08T22:44:26.480 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018897 2026-03-08T22:44:26.480 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-309237645315 4-107374182419 5-128849018897' 2026-03-08T22:44:26.480 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:26.480 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:44:26.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855376 2026-03-08T22:44:26.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855376 2026-03-08T22:44:26.559 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-309237645315 4-107374182419 5-128849018897 6-150323855376' 2026-03-08T22:44:26.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:26.559 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T22:44:26.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691854 2026-03-08T22:44:26.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691854 2026-03-08T22:44:26.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-309237645315 4-107374182419 5-128849018897 6-150323855376 7-171798691854' 2026-03-08T22:44:26.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:26.644 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T22:44:26.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528333 2026-03-08T22:44:26.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528333 2026-03-08T22:44:26.730 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-309237645315 4-107374182419 5-128849018897 6-150323855376 7-171798691854 8-193273528333' 2026-03-08T22:44:26.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:26.730 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T22:44:26.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364812 2026-03-08T22:44:26.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364812 2026-03-08T22:44:26.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-309237645315 4-107374182419 5-128849018897 6-150323855376 7-171798691854 8-193273528333 9-214748364812' 2026-03-08T22:44:26.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:26.820 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.10 flush_pg_stats 2026-03-08T22:44:26.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=236223201291 2026-03-08T22:44:26.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 236223201291 
2026-03-08T22:44:26.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-309237645315 4-107374182419 5-128849018897 6-150323855376 7-171798691854 8-193273528333 9-214748364812 10-236223201291' 2026-03-08T22:44:26.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:26.901 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836505 2026-03-08T22:44:26.901 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:26.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:44:26.902 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836505 2026-03-08T22:44:26.903 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:26.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836505 2026-03-08T22:44:26.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836505' 2026-03-08T22:44:26.903 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 21474836505 2026-03-08T22:44:26.904 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:44:27.142 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836503 -lt 21474836505 2026-03-08T22:44:27.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:44:28.143 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:44:28.143 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:44:28.370 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836505 -lt 21474836505 2026-03-08T22:44:28.370 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:28.370 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672983 2026-03-08T22:44:28.370 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:28.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:44:28.371 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672983 2026-03-08T22:44:28.371 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:28.372 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672983 2026-03-08T22:44:28.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672983' 2026-03-08T22:44:28.372 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 42949672983 2026-03-08T22:44:28.373 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:44:28.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672983 -lt 42949672983 2026-03-08T22:44:28.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:28.610 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509462 2026-03-08T22:44:28.610 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:28.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:44:28.612 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509462 2026-03-08T22:44:28.612 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:28.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509462 
2026-03-08T22:44:28.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509462' 2026-03-08T22:44:28.613 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 64424509462 2026-03-08T22:44:28.613 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:44:28.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509462 -lt 64424509462 2026-03-08T22:44:28.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:28.874 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-309237645315 2026-03-08T22:44:28.874 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:28.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:44:28.875 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-309237645315 2026-03-08T22:44:28.875 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:28.876 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=309237645315 2026-03-08T22:44:28.876 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 
'waiting osd.3 seq 309237645315' 2026-03-08T22:44:28.876 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 309237645315 2026-03-08T22:44:28.876 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:44:29.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 309237645315 -lt 309237645315 2026-03-08T22:44:29.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:29.105 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182419 2026-03-08T22:44:29.106 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:29.107 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:44:29.107 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182419 2026-03-08T22:44:29.107 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:29.108 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182419 2026-03-08T22:44:29.108 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182419' 2026-03-08T22:44:29.108 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.4 seq 107374182419 2026-03-08T22:44:29.108 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:44:29.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182419 -lt 107374182419 2026-03-08T22:44:29.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:29.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-128849018897 2026-03-08T22:44:29.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:29.376 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:44:29.376 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-128849018897 2026-03-08T22:44:29.376 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:29.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018897 2026-03-08T22:44:29.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 128849018897' 2026-03-08T22:44:29.377 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.5 seq 128849018897 2026-03-08T22:44:29.377 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 
5 2026-03-08T22:44:29.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018897 -lt 128849018897 2026-03-08T22:44:29.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:29.609 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855376 2026-03-08T22:44:29.609 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:29.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:44:29.610 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855376 2026-03-08T22:44:29.610 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:29.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855376 2026-03-08T22:44:29.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855376' 2026-03-08T22:44:29.611 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.6 seq 150323855376 2026-03-08T22:44:29.611 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:44:29.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 150323855376 -lt 150323855376 2026-03-08T22:44:29.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:29.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691854 2026-03-08T22:44:29.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:29.843 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T22:44:29.843 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691854 2026-03-08T22:44:29.844 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:29.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691854 2026-03-08T22:44:29.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691854' 2026-03-08T22:44:29.845 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.7 seq 171798691854 2026-03-08T22:44:29.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T22:44:30.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691855 -lt 171798691854 2026-03-08T22:44:30.073 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:30.073 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528333 2026-03-08T22:44:30.074 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:30.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T22:44:30.075 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528333 2026-03-08T22:44:30.075 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:30.076 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528333 2026-03-08T22:44:30.076 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528333' 2026-03-08T22:44:30.076 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.8 seq 193273528333 2026-03-08T22:44:30.076 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T22:44:30.318 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528334 -lt 193273528333 2026-03-08T22:44:30.318 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T22:44:30.318 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-214748364812 2026-03-08T22:44:30.318 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:30.320 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T22:44:30.320 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-214748364812 2026-03-08T22:44:30.320 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:30.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364812 2026-03-08T22:44:30.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 214748364812' 2026-03-08T22:44:30.321 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.9 seq 214748364812 2026-03-08T22:44:30.321 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T22:44:30.549 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364813 -lt 214748364812 2026-03-08T22:44:30.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:30.550 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
echo 10-236223201291 2026-03-08T22:44:30.550 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:30.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=10 2026-03-08T22:44:30.552 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 10-236223201291 2026-03-08T22:44:30.552 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:30.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=236223201291 2026-03-08T22:44:30.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.10 seq 236223201291' 2026-03-08T22:44:30.553 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.10 seq 236223201291 2026-03-08T22:44:30.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 10 2026-03-08T22:44:30.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 236223201291 -lt 236223201291 2026-03-08T22:44:30.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:44:30.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:30.782 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:31.077 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 16 == 0 2026-03-08T22:44:31.077 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:31.077 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:31.077 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:31.077 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:31.077 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:31.077 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:31.077 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:31.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=16 2026-03-08T22:44:31.322 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:31.322 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:31.322 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:31.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 16 = 16 2026-03-08T22:44:31.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:44:31.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:44:31.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:300: verify_chunk_mapping: objectstore_tool td/test-erasure-code 5 SOMETHINGecpool get-bytes 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:300: verify_chunk_mapping: grep --quiet SECONDecpool 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-code 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=5 2026-03-08T22:44:31.647 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-code 5 SOMETHINGecpool get-bytes 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-code 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=5 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-code TERM osd.5 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:44:31.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:44:31.647 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:44:31.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:44:31.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-code 5 SOMETHINGecpool get-bytes 2026-03-08T22:44:31.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-code 2026-03-08T22:44:31.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:44:31.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=5 2026-03-08T22:44:31.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:44:31.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-code/5 2026-03-08T22:44:31.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-code/5 SOMETHINGecpool get-bytes 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-code 5 2026-03-08T22:44:32.356 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-code 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=5 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-code/5 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-code/5' 2026-03-08T22:44:32.356 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-code/5/journal' 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:44:32.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-code' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:44:32.357 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:44:32.357 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:44:32.357 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-code/5 2026-03-08T22:44:32.358 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.5 2026-03-08T22:44:32.358 INFO:tasks.workunit.client.0.vm04.stderr:start osd.5 2026-03-08T22:44:32.358 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 5 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/5 --osd-journal=td/test-erasure-code/5/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:44:32.359 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-code/5/whoami 2026-03-08T22:44:32.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 5 = 5 ']' 2026-03-08T22:44:32.360 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:44:32.360 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:44:32.362 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:44:32.376 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:32.377+0000 7fc447214780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:32.379 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:32.380+0000 7fc447214780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:32.381 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:32.382+0000 7fc447214780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:32.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 5 2026-03-08T22:44:32.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:44:32.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T22:44:32.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:44:32.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:44:32.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:32.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:44:32.589 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:44:32.589 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:32.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:44:32.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:32.950 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:32.951+0000 7fc447214780 -1 Falling back to public interface 2026-03-08T22:44:33.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:44:33.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:33.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:44:33.828 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:44:33.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:44:33.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:33.850 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:33.852+0000 7fc447214780 -1 osd.5 74 log_to_monitors true 2026-03-08T22:44:34.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:35.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:44:35.069 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:35.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:44:35.069 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:44:35.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:35.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:44:35.454 INFO:tasks.workunit.client.0.vm04.stderr:osd.5 up in weight 1 up_from 78 up_thru 78 down_at 75 last_clean_interval [30,74) [v2:127.0.0.1:6842/2973492191,v1:127.0.0.1:6843/2973492191] [v2:127.0.0.1:6844/2973492191,v1:127.0.0.1:6845/2973492191] exists,up 90e018b2-83fd-4a55-9054-b797e1abdd19 2026-03-08T22:44:35.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:44:35.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:44:35.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:44:35.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:44:35.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:44:35.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local 
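The trace above shows `wait_for_osd` polling `ceph osd dump | grep 'osd.5 up'` once per second, for up to 300 attempts, until osd.5 reports up. A minimal, hypothetical re-implementation of that poll-until-true pattern (the function name and the injected `check`/`sleep` parameters are illustrative, not the upstream helper):

```python
import time

def wait_for(check, attempts=300, interval=1.0, sleep=time.sleep):
    """Poll check() up to `attempts` times, pausing `interval` seconds
    between failed tries; return True as soon as check() succeeds,
    False if the attempt budget is exhausted. Mirrors the wait_for_osd
    loop traced above (illustrative sketch only)."""
    for _ in range(attempts):
        if check():
            return True
        sleep(interval)
    return False
```

Injecting `sleep` makes the loop unit-testable without real delays, which is the main difference from the bash original.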
num_active_clean=-1 2026-03-08T22:44:35.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:44:35.455 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:44:35.455 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:44:35.455 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:44:35.456 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:44:35.456 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:44:35.456 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:44:35.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:44:35.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:44:35.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:44:35.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 
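Here `get_timeout_delays 90 .1` turned a 90-second budget and a 0.1 s first step into the delays array shown in the trace: doubling steps capped at 15 s, with a final 4.5 s partial step that tops the series up to exactly 90 s. A sketch that reproduces that schedule (inferred from the logged output; the real helper lives in ceph-helpers.sh and may differ in detail):

```python
def backoff_delays(timeout=90.0, first=0.1, cap=15.0):
    """Generate a capped exponential-backoff schedule: delays double
    from `first`, are clamped to `cap`, and a final partial delay is
    appended so the total equals `timeout`. Hypothetical
    re-implementation inferred from the trace, not the upstream code."""
    delays, total, d = [], 0.0, first
    while total + d <= timeout:
        delays.append(round(d, 10))
        total += d
        d = min(d * 2, cap)
    if timeout - total > 0:
        delays.append(round(timeout - total, 10))
    return delays
```

For the logged arguments this yields 0.1, 0.2, 0.4, 0.8, 1.6, 3.2, 6.4, 12.8, then four 15 s steps and the 4.5 s remainder, matching the array in the trace.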
2026-03-08T22:44:35.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:44:35.525 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:8 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:10' 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:35.762 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:44:35.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836509 2026-03-08T22:44:35.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: 
flush_pg_stats: test -z 21474836509 2026-03-08T22:44:35.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509' 2026-03-08T22:44:35.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:35.846 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:44:35.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672987 2026-03-08T22:44:35.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672987 2026-03-08T22:44:35.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-42949672987' 2026-03-08T22:44:35.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:35.928 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:44:36.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509466 2026-03-08T22:44:36.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509466 2026-03-08T22:44:36.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-42949672987 
2-64424509466' 2026-03-08T22:44:36.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:36.007 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:44:36.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=309237645319 2026-03-08T22:44:36.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 309237645319 2026-03-08T22:44:36.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-42949672987 2-64424509466 3-309237645319' 2026-03-08T22:44:36.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:36.084 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:44:36.183 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182423 2026-03-08T22:44:36.183 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182423 2026-03-08T22:44:36.183 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-42949672987 2-64424509466 3-309237645319 4-107374182423' 2026-03-08T22:44:36.184 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:36.184 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:44:36.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=335007449091 2026-03-08T22:44:36.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 335007449091 2026-03-08T22:44:36.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-42949672987 2-64424509466 3-309237645319 4-107374182423 5-335007449091' 2026-03-08T22:44:36.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:36.271 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:44:36.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855380 2026-03-08T22:44:36.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855380 2026-03-08T22:44:36.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-42949672987 2-64424509466 3-309237645319 4-107374182423 5-335007449091 6-150323855380' 2026-03-08T22:44:36.353 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:36.353 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T22:44:36.437 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691858 2026-03-08T22:44:36.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691858 2026-03-08T22:44:36.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-42949672987 2-64424509466 3-309237645319 4-107374182423 5-335007449091 6-150323855380 7-171798691858' 2026-03-08T22:44:36.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:36.473 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T22:44:36.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528337 2026-03-08T22:44:36.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528337 2026-03-08T22:44:36.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-42949672987 2-64424509466 3-309237645319 4-107374182423 5-335007449091 6-150323855380 7-171798691858 8-193273528337' 2026-03-08T22:44:36.553 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:36.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T22:44:36.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364816 2026-03-08T22:44:36.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364816 2026-03-08T22:44:36.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-42949672987 2-64424509466 3-309237645319 4-107374182423 5-335007449091 6-150323855380 7-171798691858 8-193273528337 9-214748364816' 2026-03-08T22:44:36.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:36.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.10 flush_pg_stats 2026-03-08T22:44:36.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=236223201295 2026-03-08T22:44:36.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 236223201295 2026-03-08T22:44:36.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-42949672987 2-64424509466 3-309237645319 4-107374182423 5-335007449091 6-150323855380 7-171798691858 8-193273528337 9-214748364816 
10-236223201295' 2026-03-08T22:44:36.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:36.723 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836509 2026-03-08T22:44:36.723 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:36.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:44:36.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836509 2026-03-08T22:44:36.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:36.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836509 2026-03-08T22:44:36.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836509' 2026-03-08T22:44:36.725 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 21474836509 2026-03-08T22:44:36.725 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:44:36.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836509 -lt 21474836509 2026-03-08T22:44:36.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
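`flush_pg_stats` accumulates one `<osd>-<seq>` token per OSD into `seqs`, then splits each token back apart with `cut -d - -f 1` and `-f 2` before comparing against `ceph osd last-stat-seq`. The same bookkeeping in Python (illustrative; the function and field names are mine):

```python
def parse_seq_pairs(seqs):
    """Parse the space-separated 'osd-seq' tokens built up by
    flush_pg_stats (e.g. '0-21474836509 1-42949672987') into
    (osd_id, seq) integer tuples, mirroring the cut -d - -f 1/2
    calls in the trace."""
    pairs = []
    for token in seqs.split():
        osd, seq = token.split("-", 1)  # split on the first '-' only
        pairs.append((int(osd), int(seq)))
    return pairs
```

Splitting on the first `-` only matters if a seq could ever be negative; `cut -f 2` in the original has the same first-field behavior.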
flush_pg_stats: for s in $seqs 2026-03-08T22:44:36.960 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672987 2026-03-08T22:44:36.960 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:36.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:44:36.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672987 2026-03-08T22:44:36.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:36.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672987 2026-03-08T22:44:36.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672987' 2026-03-08T22:44:36.962 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 42949672987 2026-03-08T22:44:36.962 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:44:37.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672985 -lt 42949672987 2026-03-08T22:44:37.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:44:38.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: 
flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:44:38.197 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:44:38.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672987 -lt 42949672987 2026-03-08T22:44:38.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:38.418 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509466 2026-03-08T22:44:38.418 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:38.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:44:38.419 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509466 2026-03-08T22:44:38.420 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:38.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509466 2026-03-08T22:44:38.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509466' 2026-03-08T22:44:38.420 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 64424509466 2026-03-08T22:44:38.421 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:44:38.648 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509466 -lt 64424509466 2026-03-08T22:44:38.648 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:38.649 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-309237645319 2026-03-08T22:44:38.649 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:38.650 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:44:38.650 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-309237645319 2026-03-08T22:44:38.650 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:38.651 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=309237645319 2026-03-08T22:44:38.651 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 309237645319' 2026-03-08T22:44:38.651 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 309237645319 2026-03-08T22:44:38.651 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 
2026-03-08T22:44:38.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 309237645319 -lt 309237645319 2026-03-08T22:44:38.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:38.889 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182423 2026-03-08T22:44:38.889 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:38.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:44:38.890 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182423 2026-03-08T22:44:38.890 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:38.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182423 2026-03-08T22:44:38.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182423' 2026-03-08T22:44:38.891 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.4 seq 107374182423 2026-03-08T22:44:38.891 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:44:39.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
test 107374182423 -lt 107374182423 2026-03-08T22:44:39.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:39.114 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-335007449091 2026-03-08T22:44:39.114 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:39.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:44:39.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-335007449091 2026-03-08T22:44:39.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:39.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=335007449091 2026-03-08T22:44:39.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 335007449091' 2026-03-08T22:44:39.116 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.5 seq 335007449091 2026-03-08T22:44:39.116 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:44:39.349 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 335007449091 -lt 335007449091 2026-03-08T22:44:39.349 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:39.349 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855380 2026-03-08T22:44:39.350 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:39.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:44:39.351 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855380 2026-03-08T22:44:39.352 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:39.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855380 2026-03-08T22:44:39.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855380' 2026-03-08T22:44:39.353 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.6 seq 150323855380 2026-03-08T22:44:39.353 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:44:39.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855380 -lt 150323855380 2026-03-08T22:44:39.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T22:44:39.613 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691858 2026-03-08T22:44:39.613 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:39.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T22:44:39.615 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691858 2026-03-08T22:44:39.615 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:39.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691858 2026-03-08T22:44:39.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691858' 2026-03-08T22:44:39.616 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.7 seq 171798691858 2026-03-08T22:44:39.616 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T22:44:39.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691859 -lt 171798691858 2026-03-08T22:44:39.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:39.875 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
echo 8-193273528337 2026-03-08T22:44:39.876 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:39.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T22:44:39.877 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528337 2026-03-08T22:44:39.878 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:39.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528337 2026-03-08T22:44:39.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528337' 2026-03-08T22:44:39.906 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.8 seq 193273528337 2026-03-08T22:44:39.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T22:44:40.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528338 -lt 193273528337 2026-03-08T22:44:40.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:40.137 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-214748364816 2026-03-08T22:44:40.137 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:40.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T22:44:40.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-214748364816 2026-03-08T22:44:40.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:40.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364816 2026-03-08T22:44:40.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 214748364816' 2026-03-08T22:44:40.139 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.9 seq 214748364816 2026-03-08T22:44:40.139 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T22:44:40.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364817 -lt 214748364816 2026-03-08T22:44:40.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:40.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 10-236223201295 2026-03-08T22:44:40.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T22:44:40.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=10 2026-03-08T22:44:40.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 10-236223201295 2026-03-08T22:44:40.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:40.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=236223201295 2026-03-08T22:44:40.376 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.10 seq 236223201295' 2026-03-08T22:44:40.376 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.10 seq 236223201295 2026-03-08T22:44:40.376 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 10 2026-03-08T22:44:40.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 236223201295 -lt 236223201295 2026-03-08T22:44:40.609 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:44:40.609 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:40.609 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:40.921 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: 
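[Editor's note] The `flush_pg_stats` iterations above all follow the same shape: each entry of `$seqs` is an `osd-seq` pair that gets split with `cut`, after which the helper polls `ceph osd last-stat-seq` until the reported sequence catches up. A minimal standalone reproduction of just the parsing step (values copied from the trace; no cluster needed — the real helper in `qa/standalone/ceph-helpers.sh` then loops on `ceph osd last-stat-seq`):

```shell
# Split "osd-seq" pairs the way flush_pg_stats does (ceph-helpers.sh:2273-2277).
seqs="7-171798691858 8-193273528337 10-236223201295"
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)   # OSD id before the dash
    seq=$(echo "$s" | cut -d - -f 2)   # stat sequence number after it
    echo "waiting osd.$osd seq $seq"
done
```

Note the `test ... -lt ...` lines in the trace compare the live `last-stat-seq` against `$seq`; the loop exits as soon as the live value is no longer behind.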
wait_for_clean: test 16 == 0 2026-03-08T22:44:40.921 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:40.921 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:40.921 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:40.921 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:40.921 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:40.921 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:40.921 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:41.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=16 2026-03-08T22:44:41.145 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:41.146 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:41.146 
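[Editor's note] `get_num_active_clean` above builds a jq filter that counts PGs whose state contains both "active" and "clean" but not "stale". The same selection logic can be shown in pure shell against a canned sample (assumption: Ceph state strings always list `active` before `clean`, which holds for the states in this trace; the real helper pipes `ceph --format json pg dump pgs` through jq):

```shell
# Canned PG states standing in for "pg dump pgs" output (no cluster needed).
states="active+clean active+clean stale+active+clean active+recovering"
count=0
for st in $states; do
    case $st in
        *stale*) ;;                              # stale PGs are excluded first
        *active*clean*) count=$((count + 1)) ;;  # must be both active AND clean
    esac
done
echo "$count"
```

With this sample the count is 2: the stale PG and the recovering PG are both filtered out, matching the `cur_active_clean=16` vs `num_pgs=16` comparison that lets `wait_for_clean` break out.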
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:41.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 16 = 16 2026-03-08T22:44:41.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:44:41.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:44:41.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:313: TEST_chunk_mapping: ceph osd erasure-code-profile set remap-profile plugin=lrc 'layers=[ [ "cDD", "" ] ]' mapping=_DD 'crush-steps=[ [ "choose", "osd", 0 ] ]' 2026-03-08T22:44:41.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:318: TEST_chunk_mapping: ceph osd erasure-code-profile get remap-profile 2026-03-08T22:44:41.984 INFO:tasks.workunit.client.0.vm04.stdout:crush-device-class= 2026-03-08T22:44:41.984 INFO:tasks.workunit.client.0.vm04.stdout:crush-failure-domain=host 2026-03-08T22:44:41.984 INFO:tasks.workunit.client.0.vm04.stdout:crush-num-failure-domains=0 2026-03-08T22:44:41.984 INFO:tasks.workunit.client.0.vm04.stdout:crush-osds-per-failure-domain=0 2026-03-08T22:44:41.984 INFO:tasks.workunit.client.0.vm04.stdout:crush-root=default 2026-03-08T22:44:41.984 INFO:tasks.workunit.client.0.vm04.stdout:crush-steps=[ [ "choose", "osd", 0 ] ] 2026-03-08T22:44:41.984 INFO:tasks.workunit.client.0.vm04.stdout:k=-1 2026-03-08T22:44:41.984 INFO:tasks.workunit.client.0.vm04.stdout:l=-1 2026-03-08T22:44:41.984 INFO:tasks.workunit.client.0.vm04.stdout:layers=[ [ "cDD", "" ] ] 2026-03-08T22:44:41.984 
INFO:tasks.workunit.client.0.vm04.stdout:m=-1 2026-03-08T22:44:41.984 INFO:tasks.workunit.client.0.vm04.stdout:mapping=_DD 2026-03-08T22:44:41.984 INFO:tasks.workunit.client.0.vm04.stdout:plugin=lrc 2026-03-08T22:44:41.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:319: TEST_chunk_mapping: create_pool remap-pool 12 12 erasure remap-profile 2026-03-08T22:44:41.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create remap-pool 12 12 erasure remap-profile 2026-03-08T22:44:42.360 INFO:tasks.workunit.client.0.vm04.stderr:pool 'remap-pool' already exists 2026-03-08T22:44:42.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:44:43.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:327: TEST_chunk_mapping: verify_chunk_mapping td/test-erasure-code remap-pool 1 2 2026-03-08T22:44:43.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:281: verify_chunk_mapping: local dir=td/test-erasure-code 2026-03-08T22:44:43.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:282: verify_chunk_mapping: local poolname=remap-pool 2026-03-08T22:44:43.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:283: verify_chunk_mapping: local first=1 2026-03-08T22:44:43.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:284: verify_chunk_mapping: local second=2 2026-03-08T22:44:43.374 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:286: verify_chunk_mapping: chunk_size 2026-03-08T22:44:43.375 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:270: chunk_size: ceph-conf --show-config-value osd_pool_erasure_code_stripe_unit 2026-03-08T22:44:43.387 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:270: chunk_size: echo 4096 2026-03-08T22:44:43.387 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:286: verify_chunk_mapping: printf '%*s' 4096 FIRSTremap-pool 2026-03-08T22:44:43.387 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:286: verify_chunk_mapping: chunk_size 2026-03-08T22:44:43.387 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:270: chunk_size: ceph-conf --show-config-value osd_pool_erasure_code_stripe_unit 2026-03-08T22:44:43.399 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:270: chunk_size: echo 4096 2026-03-08T22:44:43.400 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:286: verify_chunk_mapping: printf '%*s' 4096 SECONDremap-pool 2026-03-08T22:44:43.400 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:286: verify_chunk_mapping: local 'payload= FIRSTremap-pool SECONDremap-pool' 2026-03-08T22:44:43.400 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:287: 
verify_chunk_mapping: echo -n ' FIRSTremap-pool SECONDremap-pool' 2026-03-08T22:44:43.400 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:289: verify_chunk_mapping: rados --pool remap-pool put SOMETHINGremap-pool td/test-erasure-code/ORIGINAL 2026-03-08T22:44:43.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:290: verify_chunk_mapping: rados --pool remap-pool get SOMETHINGremap-pool td/test-erasure-code/COPY 2026-03-08T22:44:43.451 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:291: verify_chunk_mapping: get_osds remap-pool SOMETHINGremap-pool 2026-03-08T22:44:43.452 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=remap-pool 2026-03-08T22:44:43.452 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHINGremap-pool 2026-03-08T22:44:43.452 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map remap-pool SOMETHINGremap-pool 2026-03-08T22:44:43.452 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:44:43.681 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=4 2026-03-08T22:44:43.681 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:44:43.681 INFO:tasks.workunit.client.0.vm04.stderr:9' 2026-03-08T22:44:43.681 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 4 6 9 
2026-03-08T22:44:43.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:291: verify_chunk_mapping: osds=('4' '6' '9') 2026-03-08T22:44:43.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:291: verify_chunk_mapping: local -a osds 2026-03-08T22:44:43.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i = 0 )) 2026-03-08T22:44:43.682 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i < 3 )) 2026-03-08T22:44:43.682 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:293: verify_chunk_mapping: ceph daemon osd.4 flush_journal 2026-03-08T22:44:43.760 INFO:tasks.workunit.client.0.vm04.stderr:Can't get admin socket path: unable to get conf option admin_socket for osd: b"error parsing 'osd': expected string of the form TYPE.ID, valid types are: auth, mon, osd, mds, mgr, client\n" 2026-03-08T22:44:43.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i++ )) 2026-03-08T22:44:43.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i < 3 )) 2026-03-08T22:44:43.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:293: verify_chunk_mapping: ceph daemon osd.6 flush_journal 2026-03-08T22:44:43.831 INFO:tasks.workunit.client.0.vm04.stderr:Can't get admin socket path: unable to get conf option admin_socket for osd: b"error parsing 'osd': expected string of 
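[Editor's note] The `osds=('4' '6' '9')` assignment above comes from `get_osds` echoing the newline-separated acting set (extracted with `jq '.acting | .[]'`) and the caller capturing it into a bash array. A minimal sketch of that contract, with the jq output replaced by a literal (bash required for the array syntax):

```shell
# Newline-separated OSD ids, as jq '.acting | .[]' would emit them.
acting='4
6
9'
osds=($acting)    # unquoted expansion word-splits on the embedded newlines
echo "primary=${osds[0]} acting_size=${#osds[@]}"
```

The first element is the primary OSD of the acting set, which is why the test scripts index `${osds[0]}` when they need the primary.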
the form TYPE.ID, valid types are: auth, mon, osd, mds, mgr, client\n" 2026-03-08T22:44:43.833 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i++ )) 2026-03-08T22:44:43.833 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i < 3 )) 2026-03-08T22:44:43.833 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:293: verify_chunk_mapping: ceph daemon osd.9 flush_journal 2026-03-08T22:44:43.902 INFO:tasks.workunit.client.0.vm04.stderr:Can't get admin socket path: unable to get conf option admin_socket for osd: b"error parsing 'osd': expected string of the form TYPE.ID, valid types are: auth, mon, osd, mds, mgr, client\n" 2026-03-08T22:44:43.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i++ )) 2026-03-08T22:44:43.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:292: verify_chunk_mapping: (( i < 3 )) 2026-03-08T22:44:43.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:295: verify_chunk_mapping: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:44:43.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:296: verify_chunk_mapping: rm td/test-erasure-code/COPY 2026-03-08T22:44:43.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:298: verify_chunk_mapping: get_osds remap-pool SOMETHINGremap-pool 2026-03-08T22:44:43.906 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=remap-pool 2026-03-08T22:44:43.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHINGremap-pool 2026-03-08T22:44:43.906 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map remap-pool SOMETHINGremap-pool 2026-03-08T22:44:43.907 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=4 2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr:9' 2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 4 6 9 2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:298: verify_chunk_mapping: osds=('4' '6' '9') 2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:298: verify_chunk_mapping: local -a osds 2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:299: verify_chunk_mapping: objectstore_tool td/test-erasure-code 6 SOMETHINGremap-pool get-bytes 2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-code 
2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=6 2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:44:44.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-code 6 SOMETHINGremap-pool get-bytes 2026-03-08T22:44:44.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-code 2026-03-08T22:44:44.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:44:44.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=6 2026-03-08T22:44:44.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:44:44.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-code TERM osd.6 2026-03-08T22:44:44.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:299: verify_chunk_mapping: grep --quiet FIRSTremap-pool 2026-03-08T22:44:44.137 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: 
kill_daemons: shopt -q -o xtrace 2026-03-08T22:44:44.137 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:44:44.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:44:44.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:44:44.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:44:44.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:44:44.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-code 6 SOMETHINGremap-pool get-bytes 2026-03-08T22:44:44.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-code 2026-03-08T22:44:44.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:44:44.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=6 2026-03-08T22:44:44.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:44:44.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local 
osd_data=td/test-erasure-code/6 2026-03-08T22:44:44.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-code/6 SOMETHINGremap-pool get-bytes 2026-03-08T22:44:44.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-code 6 2026-03-08T22:44:44.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-code 2026-03-08T22:44:44.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:44:44.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=6 2026-03-08T22:44:44.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:44:44.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-code/6 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' 
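[Editor's note] The `_objectstore_tool_nowait` trace above follows a fixed three-step dance: stop the OSD, run `ceph-objectstore-tool` against its offline data directory, then restart it. An outline of that sequence with stub helpers (the stubs only echo; the real `kill_daemons`/`activate_osd` in ceph-helpers.sh signal and respawn actual daemons):

```shell
# Stubs stand in for the real ceph-helpers.sh functions.
kill_daemons() { echo "kill -TERM osd.$3"; }
activate_osd() { echo "start osd.$2"; }

objectstore_tool() {
    local dir=$1 id=$2 obj=$3 op=$4
    kill_daemons "$dir" TERM "$id"                               # OSD must be down first
    echo "ceph-objectstore-tool --data-path $dir/$id $obj $op"   # operate on the offline store
    activate_osd "$dir" "$id"                                    # bring the OSD back up
}

objectstore_tool td/test-erasure-code 6 SOMETHINGremap-pool get-bytes
```

This is why the trace interleaves `kill_daemons`, the tool invocation, and `activate_osd`: the object store cannot be read by the tool while the daemon holds it.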
--osd-journal-size=100' 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-code/6' 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-code/6/journal' 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-code' 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 
2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:44:44.846 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' 
--osd-mclock-profile=high_recovery_ops' 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-code/6 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.6 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:start osd.6 2026-03-08T22:44:44.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 6 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-code/6 --osd-journal=td/test-erasure-code/6/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:44:44.848 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-code/6/whoami 2026-03-08T22:44:44.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 6 = 6 ']' 2026-03-08T22:44:44.849 
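[Editor's note] The long `ceph-osd -i 6 ...` command line above is assembled by `activate_osd` one `ceph_args+=` append at a time, as the preceding trace lines show. An abridged sketch of that accumulation pattern (bash `+=` string append; fsid and paths copied from the trace):

```shell
# Abridged version of activate_osd's flag accumulation (ceph-helpers.sh:852-870).
dir=td/test-erasure-code
id=6
ceph_args="--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none"
ceph_args+=" --osd-data=$dir/$id"
ceph_args+=" --osd-journal=$dir/$id/journal"
ceph_args+=" --run-dir=$dir"
ceph_args+=" --debug-osd=20"
echo "ceph-osd -i $id $ceph_args"
```

Flags like `--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok` are appended single-quoted in the real helper so that `$cluster`/`$name` survive as literal metavariables for the daemon to expand, which is why they appear quoted in the final command line above.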
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:44:44.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:44:44.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:44:44.872 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:44.873+0000 7fd54020c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:44.878 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:44.880+0000 7fd54020c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:44.881 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:44.882+0000 7fd54020c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:44:45.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 6
2026-03-08T22:44:45.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:44:45.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=6
2026-03-08T22:44:45.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:44:45.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:44:45.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:44:45.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:44:45.084 INFO:tasks.workunit.client.0.vm04.stderr:0
2026-03-08T22:44:45.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:44:45.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up'
2026-03-08T22:44:45.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:44:45.686 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:45.687+0000 7fd54020c780 -1 Falling back to public interface
2026-03-08T22:44:46.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:44:46.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:44:46.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:44:46.317 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:44:46.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:44:46.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up'
2026-03-08T22:44:46.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:44:46.827 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:46.829+0000 7fd54020c780 -1 osd.6 88 log_to_monitors true
2026-03-08T22:44:47.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:44:47.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:44:47.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:44:47.545 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:44:47.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:44:47.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up'
2026-03-08T22:44:47.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:44:48.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:44:48.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:44:48.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:44:48.794 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:44:48.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:44:48.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up'
2026-03-08T22:44:49.013 INFO:tasks.workunit.client.0.vm04.stderr:osd.6 up in weight 1 up_from 92 up_thru 92 down_at 89 last_clean_interval [35,88) [v2:127.0.0.1:6850/2214822817,v1:127.0.0.1:6851/2214822817] [v2:127.0.0.1:6852/2214822817,v1:127.0.0.1:6853/2214822817] exists,up 1473e18f-5ee6-44b3-9c16-2f2200607177
2026-03-08T22:44:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:44:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:44:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:44:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:44:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:44:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:44:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:44:49.014 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:44:49.014 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:44:49.014 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:44:49.014 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:44:49.014 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:44:49.014 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:44:49.086 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:44:49.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:44:49.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:44:49.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:44:49.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:44:49.087 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:4
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:5
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:6
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:7
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:8
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:9
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:10'
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:49.314 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:44:49.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836513
2026-03-08T22:44:49.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836513
2026-03-08T22:44:49.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513'
2026-03-08T22:44:49.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:49.403 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:44:49.486 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672992
2026-03-08T22:44:49.486 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672992
2026-03-08T22:44:49.486 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672992'
2026-03-08T22:44:49.486 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:49.486 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:44:49.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509470
2026-03-08T22:44:49.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509470
2026-03-08T22:44:49.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672992 2-64424509470'
2026-03-08T22:44:49.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:49.562 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:44:49.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=309237645324
2026-03-08T22:44:49.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 309237645324
2026-03-08T22:44:49.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672992 2-64424509470 3-309237645324'
2026-03-08T22:44:49.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:49.647 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T22:44:49.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182427
2026-03-08T22:44:49.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182427
2026-03-08T22:44:49.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672992 2-64424509470 3-309237645324 4-107374182427'
2026-03-08T22:44:49.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:49.709 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats
2026-03-08T22:44:49.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=335007449096
2026-03-08T22:44:49.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 335007449096
2026-03-08T22:44:49.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672992 2-64424509470 3-309237645324 4-107374182427 5-335007449096'
2026-03-08T22:44:49.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:49.778 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats
2026-03-08T22:44:49.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=395136991235
2026-03-08T22:44:49.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 395136991235
2026-03-08T22:44:49.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672992 2-64424509470 3-309237645324 4-107374182427 5-335007449096 6-395136991235'
2026-03-08T22:44:49.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:49.853 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats
2026-03-08T22:44:49.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691863
2026-03-08T22:44:49.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691863
2026-03-08T22:44:49.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672992 2-64424509470 3-309237645324 4-107374182427 5-335007449096 6-395136991235 7-171798691863'
2026-03-08T22:44:49.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:49.931 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats
2026-03-08T22:44:50.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528342
2026-03-08T22:44:50.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528342
2026-03-08T22:44:50.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672992 2-64424509470 3-309237645324 4-107374182427 5-335007449096 6-395136991235 7-171798691863 8-193273528342'
2026-03-08T22:44:50.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:50.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats
2026-03-08T22:44:50.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364821
2026-03-08T22:44:50.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364821
2026-03-08T22:44:50.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672992 2-64424509470 3-309237645324 4-107374182427 5-335007449096 6-395136991235 7-171798691863 8-193273528342 9-214748364821'
2026-03-08T22:44:50.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:44:50.084 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.10 flush_pg_stats
2026-03-08T22:44:50.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=236223201299
2026-03-08T22:44:50.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 236223201299
2026-03-08T22:44:50.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672992 2-64424509470 3-309237645324 4-107374182427 5-335007449096 6-395136991235 7-171798691863 8-193273528342 9-214748364821 10-236223201299'
2026-03-08T22:44:50.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:50.154 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836513
2026-03-08T22:44:50.154 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:44:50.155 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836513
2026-03-08T22:44:50.155 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:50.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836513
2026-03-08T22:44:50.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836513'
2026-03-08T22:44:50.156 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 21474836513
2026-03-08T22:44:50.157 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:44:50.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836513 -lt 21474836513
2026-03-08T22:44:50.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:50.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672992
2026-03-08T22:44:50.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:50.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:44:50.375 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672992
2026-03-08T22:44:50.376 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:50.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672992
2026-03-08T22:44:50.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672992'
2026-03-08T22:44:50.377 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 42949672992
2026-03-08T22:44:50.377 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:44:50.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672992 -lt 42949672992
2026-03-08T22:44:50.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:50.608 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509470
2026-03-08T22:44:50.608 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:50.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:44:50.609 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509470
2026-03-08T22:44:50.609 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:50.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509470
2026-03-08T22:44:50.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509470'
2026-03-08T22:44:50.610 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 64424509470
2026-03-08T22:44:50.610 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:44:50.843 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509470 -lt 64424509470
2026-03-08T22:44:50.843 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:50.843 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-309237645324
2026-03-08T22:44:50.843 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:50.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:44:50.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-309237645324
2026-03-08T22:44:50.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:50.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=309237645324
2026-03-08T22:44:50.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 309237645324'
2026-03-08T22:44:50.846 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 309237645324
2026-03-08T22:44:50.846 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:44:51.066 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 309237645324 -lt 309237645324
2026-03-08T22:44:51.066 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:51.067 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182427
2026-03-08T22:44:51.067 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:51.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T22:44:51.068 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182427
2026-03-08T22:44:51.068 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:51.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182427
2026-03-08T22:44:51.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182427'
2026-03-08T22:44:51.069 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.4 seq 107374182427
2026-03-08T22:44:51.069 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T22:44:51.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182427 -lt 107374182427
2026-03-08T22:44:51.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:51.290 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-335007449096
2026-03-08T22:44:51.291 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:51.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5
2026-03-08T22:44:51.292 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-335007449096
2026-03-08T22:44:51.292 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:51.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=335007449096
2026-03-08T22:44:51.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 335007449096'
2026-03-08T22:44:51.293 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.5 seq 335007449096
2026-03-08T22:44:51.294 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5
2026-03-08T22:44:51.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 335007449096 -lt 335007449096
2026-03-08T22:44:51.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:51.517 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-395136991235
2026-03-08T22:44:51.517 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:51.518 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6
2026-03-08T22:44:51.518 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-395136991235
2026-03-08T22:44:51.518 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:51.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=395136991235
2026-03-08T22:44:51.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 395136991235'
2026-03-08T22:44:51.519 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.6 seq 395136991235
2026-03-08T22:44:51.520 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6
2026-03-08T22:44:51.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 395136991235 -lt 395136991235
2026-03-08T22:44:51.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:51.742 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691863
2026-03-08T22:44:51.742 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:51.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7
2026-03-08T22:44:51.744 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691863
2026-03-08T22:44:51.744 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:44:51.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691863
2026-03-08T22:44:51.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691863'
2026-03-08T22:44:51.745 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.7 seq 171798691863
2026-03-08T22:44:51.745 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7
2026-03-08T22:44:51.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691864 -lt 171798691863
2026-03-08T22:44:51.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:44:51.968 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528342
2026-03-08T22:44:51.968 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:44:51.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8
2026-03-08T22:44:51.969
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528342 2026-03-08T22:44:51.969 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:51.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528342 2026-03-08T22:44:51.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528342' 2026-03-08T22:44:51.970 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.8 seq 193273528342 2026-03-08T22:44:51.970 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T22:44:52.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528342 -lt 193273528342 2026-03-08T22:44:52.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:52.214 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-214748364821 2026-03-08T22:44:52.214 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:52.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T22:44:52.216 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-214748364821 
2026-03-08T22:44:52.216 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:52.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364821 2026-03-08T22:44:52.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 214748364821' 2026-03-08T22:44:52.217 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.9 seq 214748364821 2026-03-08T22:44:52.217 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T22:44:52.457 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364821 -lt 214748364821 2026-03-08T22:44:52.457 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:52.457 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 10-236223201299 2026-03-08T22:44:52.457 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:52.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=10 2026-03-08T22:44:52.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 10-236223201299 2026-03-08T22:44:52.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:52.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=236223201299 2026-03-08T22:44:52.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.10 seq 236223201299' 2026-03-08T22:44:52.460 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.10 seq 236223201299 2026-03-08T22:44:52.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 10 2026-03-08T22:44:52.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 236223201300 -lt 236223201299 2026-03-08T22:44:52.738 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:44:52.738 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:52.738 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:53.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 28 == 0 2026-03-08T22:44:53.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:44:53.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:44:53.065 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:44:53.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:44:53.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:44:53.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:44:53.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:44:53.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=28 2026-03-08T22:44:53.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:44:53.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:44:53.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:44:53.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 28 = 28 2026-03-08T22:44:53.571 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:44:53.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:300: verify_chunk_mapping: objectstore_tool td/test-erasure-code 9 SOMETHINGremap-pool get-bytes 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-code 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=9 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-code 9 SOMETHINGremap-pool get-bytes 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-code 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=9 
2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-code TERM osd.9 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:300: verify_chunk_mapping: grep --quiet SECONDremap-pool 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:44:53.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:44:53.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:44:53.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-code 9 SOMETHINGremap-pool get-bytes 2026-03-08T22:44:53.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local 
dir=td/test-erasure-code 2026-03-08T22:44:53.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:44:53.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=9 2026-03-08T22:44:53.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:44:53.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-code/9 2026-03-08T22:44:53.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-code/9 SOMETHINGremap-pool get-bytes 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-code 9 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-code 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=9 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: 
activate_osd: local osd_data=td/test-erasure-code/9 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false' 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-code/9' 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-code/9/journal' 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:44:54.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-code' 2026-03-08T22:44:54.268 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:44:54.268 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:44:54.268 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:44:54.268 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:44:54.268 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:44:54.268 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:44:54.268 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:44:54.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' 2026-03-08T22:44:54.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:44:54.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-code/$name.log' 2026-03-08T22:44:54.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/test-erasure-code/$name.pid' 2026-03-08T22:44:54.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:44:54.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:44:54.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:44:54.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:44:54.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:44:54.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:44:54.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-code/9 2026-03-08T22:44:54.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.9 2026-03-08T22:44:54.270 INFO:tasks.workunit.client.0.vm04.stderr:start osd.9 2026-03-08T22:44:54.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 9 --fsid=54af98f1-8849-4bd3-8dde-70ffbbafe96d --auth-supported=none --mon-host=127.0.0.1:7101 --mon-osd-prime-pg-temp=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/test-erasure-code/9 --osd-journal=td/test-erasure-code/9/journal --chdir= --run-dir=td/test-erasure-code '--admin-socket=/tmp/ceph-asok.74377/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-code/$name.log' '--pid-file=td/test-erasure-code/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:44:54.270 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-code/9/whoami 2026-03-08T22:44:54.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 9 = 9 ']' 2026-03-08T22:44:54.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:44:54.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:44:54.274 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:44:54.294 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:54.294+0000 7f5b32618780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:54.299 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:54.301+0000 7f5b32618780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:44:54.301 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:54.302+0000 7f5b32618780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:44:54.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 9 2026-03-08T22:44:54.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:44:54.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9 2026-03-08T22:44:54.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:44:54.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:44:54.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:54.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:44:54.497 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:44:54.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:54.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T22:44:54.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:55.373 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:55.374+0000 7f5b32618780 -1 Falling back to public interface 2026-03-08T22:44:55.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:44:55.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:55.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:44:55.726 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:44:55.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:55.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T22:44:55.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:44:56.255 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:56.257+0000 7f5b32618780 -1 osd.9 93 log_to_monitors true 2026-03-08T22:44:56.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:44:56.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:44:56.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:44:56.961 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:44:56.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T22:44:56.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:44:57.110 
INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:44:57.111+0000 7f5b29004640 -1 osd.9 93 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:44:57.261 INFO:tasks.workunit.client.0.vm04.stderr:osd.9 up in weight 1 up_from 97 up_thru 65 down_at 94 last_clean_interval [50,93) [v2:127.0.0.1:6874/4145537853,v1:127.0.0.1:6875/4145537853] [v2:127.0.0.1:6876/4145537853,v1:127.0.0.1:6877/4145537853] exists,up e624fc0a-0f7d-4c72-92d5-27028e3149b6 2026-03-08T22:44:57.261 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:44:57.261 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:44:57.261 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:44:57.261 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:44:57.261 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:44:57.261 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:44:57.261 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:44:57.261 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:44:57.262 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: 
get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:44:57.262 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:44:57.262 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:44:57.262 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:44:57.262 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:44:57.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:44:57.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:44:57.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:44:57.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:44:57.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:44:57.352 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:44:57.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:44:57.588 
INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr:8 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr:10' 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:57.589 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:44:57.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836517 2026-03-08T22:44:57.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836517 2026-03-08T22:44:57.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517' 2026-03-08T22:44:57.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:57.663 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: 
flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:44:57.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672995 2026-03-08T22:44:57.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672995 2026-03-08T22:44:57.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672995' 2026-03-08T22:44:57.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:57.731 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:44:57.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509474 2026-03-08T22:44:57.806 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509474 2026-03-08T22:44:57.806 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672995 2-64424509474' 2026-03-08T22:44:57.806 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:57.806 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:44:57.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
seq=309237645327 2026-03-08T22:44:57.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 309237645327 2026-03-08T22:44:57.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672995 2-64424509474 3-309237645327' 2026-03-08T22:44:57.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:57.888 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:44:57.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182431 2026-03-08T22:44:57.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182431 2026-03-08T22:44:57.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672995 2-64424509474 3-309237645327 4-107374182431' 2026-03-08T22:44:57.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:57.969 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:44:58.044 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=335007449099 2026-03-08T22:44:58.044 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 335007449099 2026-03-08T22:44:58.044 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672995 2-64424509474 3-309237645327 4-107374182431 5-335007449099' 2026-03-08T22:44:58.044 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:58.044 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:44:58.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=395136991239 2026-03-08T22:44:58.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 395136991239 2026-03-08T22:44:58.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672995 2-64424509474 3-309237645327 4-107374182431 5-335007449099 6-395136991239' 2026-03-08T22:44:58.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:58.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T22:44:58.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691867 2026-03-08T22:44:58.189 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691867 2026-03-08T22:44:58.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672995 2-64424509474 3-309237645327 4-107374182431 5-335007449099 6-395136991239 7-171798691867' 2026-03-08T22:44:58.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:58.189 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T22:44:58.264 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528345 2026-03-08T22:44:58.264 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528345 2026-03-08T22:44:58.264 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672995 2-64424509474 3-309237645327 4-107374182431 5-335007449099 6-395136991239 7-171798691867 8-193273528345' 2026-03-08T22:44:58.264 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:58.264 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T22:44:58.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=416611827715 2026-03-08T22:44:58.337 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 416611827715 2026-03-08T22:44:58.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672995 2-64424509474 3-309237645327 4-107374182431 5-335007449099 6-395136991239 7-171798691867 8-193273528345 9-416611827715' 2026-03-08T22:44:58.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:44:58.338 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.10 flush_pg_stats 2026-03-08T22:44:58.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=236223201303 2026-03-08T22:44:58.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 236223201303 2026-03-08T22:44:58.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672995 2-64424509474 3-309237645327 4-107374182431 5-335007449099 6-395136991239 7-171798691867 8-193273528345 9-416611827715 10-236223201303' 2026-03-08T22:44:58.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:58.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836517 2026-03-08T22:44:58.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
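At this point `flush_pg_stats` has told every OSD to flush and accumulated a space-separated list of `osd-seq` tokens (`0-21474836517 … 10-236223201303`), which the loop below splits back apart with `cut -d - -f 1` and `cut -d - -f 2`. A hypothetical Python equivalent of that token parsing, for illustration only:

```python
def parse_seqs(seqs: str) -> dict:
    """Split 'osd-seq' tokens into {osd_id: seq}, as the cut -d - -f 1/2 pair does."""
    out = {}
    for token in seqs.split():
        osd, _, seq = token.partition("-")  # split on the first '-'
        out[int(osd)] = int(seq)
    return out
```

Note that splitting on the first `-` is safe here because OSD ids never contain a dash, while the seq itself is a plain integer.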
2026-03-08T22:44:58.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:44:58.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836517 2026-03-08T22:44:58.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:58.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836517 2026-03-08T22:44:58.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836517' 2026-03-08T22:44:58.419 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 21474836517 2026-03-08T22:44:58.419 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:44:58.651 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836517 -lt 21474836517 2026-03-08T22:44:58.651 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:58.651 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672995 2026-03-08T22:44:58.651 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:58.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 
2026-03-08T22:44:58.653 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672995 2026-03-08T22:44:58.653 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:58.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672995 2026-03-08T22:44:58.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672995' 2026-03-08T22:44:58.654 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 42949672995 2026-03-08T22:44:58.654 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:44:58.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672995 -lt 42949672995 2026-03-08T22:44:58.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:58.890 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509474 2026-03-08T22:44:58.890 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:58.892 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:44:58.892 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d 
- -f 2 2026-03-08T22:44:58.894 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509474 2026-03-08T22:44:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509474 2026-03-08T22:44:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509474' 2026-03-08T22:44:58.895 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 64424509474 2026-03-08T22:44:58.895 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:44:59.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509474 -lt 64424509474 2026-03-08T22:44:59.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:59.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-309237645327 2026-03-08T22:44:59.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:59.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:44:59.116 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-309237645327 2026-03-08T22:44:59.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:59.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=309237645327 2026-03-08T22:44:59.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 309237645327' 2026-03-08T22:44:59.118 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 309237645327 2026-03-08T22:44:59.118 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:44:59.335 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 309237645327 -lt 309237645327 2026-03-08T22:44:59.335 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:44:59.335 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182431 2026-03-08T22:44:59.335 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:44:59.336 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:44:59.336 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182431 2026-03-08T22:44:59.336 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:44:59.337 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182431 2026-03-08T22:44:59.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182431' 2026-03-08T22:44:59.337 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.4 seq 107374182431 2026-03-08T22:44:59.338 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:44:59.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182429 -lt 107374182431 2026-03-08T22:44:59.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:45:00.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:45:00.572 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:45:00.800 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182431 -lt 107374182431 2026-03-08T22:45:00.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:00.801 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-335007449099 2026-03-08T22:45:00.801 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
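The osd.4 step above is the one place the flush had not yet landed: `last-stat-seq` returned 107374182429, which is below the flushed seq 107374182431, so the helper slept 1s and re-polled until the reported seq caught up. A hedged sketch of that poll loop (the `fetch` callable stands in for `ceph osd last-stat-seq <id>`; the timeout accounting here is illustrative, not the script's exact bookkeeping):

```python
import time

def wait_for_seq(fetch, want, timeout=300, delay=1.0):
    """Poll fetch() until it reaches `want`, sleeping `delay` between attempts."""
    while fetch() < want:
        if timeout <= 0:
            return False  # gave up before the OSD reported the flushed seq
        time.sleep(delay)
        timeout -= 1
    return True
```

In the trace the second poll already succeeds, so only one `sleep 1` is paid per lagging OSD.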
flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:00.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:45:00.802 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-335007449099 2026-03-08T22:45:00.802 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:00.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=335007449099 2026-03-08T22:45:00.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 335007449099' 2026-03-08T22:45:00.803 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.5 seq 335007449099 2026-03-08T22:45:00.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:45:01.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 335007449100 -lt 335007449099 2026-03-08T22:45:01.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:01.021 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-395136991239 2026-03-08T22:45:01.021 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:01.022 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:45:01.022 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-395136991239 2026-03-08T22:45:01.023 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:01.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=395136991239 2026-03-08T22:45:01.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 395136991239' 2026-03-08T22:45:01.023 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.6 seq 395136991239 2026-03-08T22:45:01.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:45:01.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 395136991239 -lt 395136991239 2026-03-08T22:45:01.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:01.281 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691867 2026-03-08T22:45:01.281 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:01.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 
2026-03-08T22:45:01.283 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691867 2026-03-08T22:45:01.283 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:01.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691867 2026-03-08T22:45:01.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691867' 2026-03-08T22:45:01.284 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.7 seq 171798691867 2026-03-08T22:45:01.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T22:45:01.508 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691867 -lt 171798691867 2026-03-08T22:45:01.508 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:01.508 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528345 2026-03-08T22:45:01.508 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:01.509 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T22:45:01.509 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
echo 8-193273528345 2026-03-08T22:45:01.509 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:01.510 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528345 2026-03-08T22:45:01.510 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528345' 2026-03-08T22:45:01.510 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.8 seq 193273528345 2026-03-08T22:45:01.510 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T22:45:01.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528346 -lt 193273528345 2026-03-08T22:45:01.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:01.746 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-416611827715 2026-03-08T22:45:01.746 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:01.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T22:45:01.748 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-416611827715 2026-03-08T22:45:01.748 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:01.749 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=416611827715 2026-03-08T22:45:01.749 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 416611827715' 2026-03-08T22:45:01.749 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.9 seq 416611827715 2026-03-08T22:45:01.749 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T22:45:01.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 416611827716 -lt 416611827715 2026-03-08T22:45:01.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:01.975 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 10-236223201303 2026-03-08T22:45:01.976 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:01.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=10 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 10-236223201303 2026-03-08T22:45:01.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T22:45:01.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=236223201303 2026-03-08T22:45:01.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.10 seq 236223201303' 2026-03-08T22:45:01.978 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.10 seq 236223201303 2026-03-08T22:45:01.978 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 10 2026-03-08T22:45:02.206 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 236223201304 -lt 236223201303 2026-03-08T22:45:02.207 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:45:02.207 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:45:02.207 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:02.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 28 == 0 2026-03-08T22:45:02.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:45:02.603 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:45:02.603 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: 
get_num_active_clean: local expression 2026-03-08T22:45:02.604 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:45:02.604 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:45:02.604 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:45:02.604 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:45:02.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=28 2026-03-08T22:45:02.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:45:02.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:45:02.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:03.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 28 = 28 2026-03-08T22:45:03.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:45:03.129 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:45:03.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:329: TEST_chunk_mapping: delete_pool remap-pool 2026-03-08T22:45:03.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=remap-pool 2026-03-08T22:45:03.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete remap-pool remap-pool --yes-i-really-really-mean-it 2026-03-08T22:45:03.529 INFO:tasks.workunit.client.0.vm04.stderr:pool 'remap-pool' does not exist 2026-03-08T22:45:03.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:330: TEST_chunk_mapping: ceph osd erasure-code-profile rm remap-profile 2026-03-08T22:45:03.837 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile remap-profile does not exist 2026-03-08T22:45:03.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:47: run: for func in $funcs 2026-03-08T22:45:03.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:48: run: TEST_rados_put_get_isa td/test-erasure-code 2026-03-08T22:45:03.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:193: TEST_rados_put_get_isa: erasure_code_plugin_exists isa 2026-03-08T22:45:03.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2114: erasure_code_plugin_exists: local plugin=isa 2026-03-08T22:45:03.849 
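The `flush_pg_stats` xtrace above (ceph-helpers.sh:2273-2277) walks a list of `<osd>-<stat_seq>` tokens and splits each one with `cut -d -`. A standalone sketch of just that parsing step, with the token values copied from the trace so it runs without a cluster:

```shell
#!/usr/bin/env bash
# Mirrors the osd/seq parsing seen in ceph-helpers.sh flush_pg_stats:
# each token is "<osd>-<stat_seq>"; cut -d - -f 1/2 splits on the dash.
# The seqs values are copied from the trace, not queried live.
seqs="9-416611827715 10-236223201303"
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)   # part before the dash: osd id
    seq=$(echo "$s" | cut -d - -f 2)   # part after the dash: last flushed stat seq
    echo "waiting osd.$osd seq $seq"
done
```

In the real helper this loop then polls `ceph osd last-stat-seq <osd>` until the reported sequence catches up, as the `test ... -lt ...` lines in the trace show.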
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2115: erasure_code_plugin_exists: local status 2026-03-08T22:45:03.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2116: erasure_code_plugin_exists: local grepstr 2026-03-08T22:45:03.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2117: erasure_code_plugin_exists: local s 2026-03-08T22:45:03.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2118: erasure_code_plugin_exists: case `uname` in 2026-03-08T22:45:03.849 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2118: erasure_code_plugin_exists: uname 2026-03-08T22:45:03.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2120: erasure_code_plugin_exists: grepstr='isa.*No such file' 2026-03-08T22:45:03.850 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2123: erasure_code_plugin_exists: ceph osd erasure-code-profile set TESTPROFILE plugin=isa 2026-03-08T22:45:04.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2123: erasure_code_plugin_exists: s= 2026-03-08T22:45:04.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2124: erasure_code_plugin_exists: local status=0 2026-03-08T22:45:04.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2125: erasure_code_plugin_exists: '[' 0 -eq 0 ']' 2026-03-08T22:45:04.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2126: erasure_code_plugin_exists: ceph osd 
erasure-code-profile rm TESTPROFILE 2026-03-08T22:45:04.436 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile TESTPROFILE does not exist 2026-03-08T22:45:04.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2132: erasure_code_plugin_exists: return 0 2026-03-08T22:45:04.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:197: TEST_rados_put_get_isa: local dir=td/test-erasure-code 2026-03-08T22:45:04.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:198: TEST_rados_put_get_isa: local poolname=pool-isa 2026-03-08T22:45:04.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:200: TEST_rados_put_get_isa: ceph osd erasure-code-profile set profile-isa plugin=isa crush-failure-domain=osd 2026-03-08T22:45:04.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:203: TEST_rados_put_get_isa: create_pool pool-isa 1 1 erasure profile-isa 2026-03-08T22:45:04.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-isa 1 1 erasure profile-isa 2026-03-08T22:45:05.117 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-isa' already exists 2026-03-08T22:45:05.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:06.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:206: TEST_rados_put_get_isa: rados_put_get td/test-erasure-code pool-isa 2026-03-08T22:45:06.130 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:66: rados_put_get: local dir=td/test-erasure-code 2026-03-08T22:45:06.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:67: rados_put_get: local poolname=pool-isa 2026-03-08T22:45:06.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:68: rados_put_get: local objname=SOMETHING 2026-03-08T22:45:06.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:06.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 AAA 2026-03-08T22:45:06.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:06.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 BBB 2026-03-08T22:45:06.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:06.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 CCCC 2026-03-08T22:45:06.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:06.131 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 DDDD 2026-03-08T22:45:06.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:78: rados_put_get: rados --pool pool-isa put SOMETHING td/test-erasure-code/ORIGINAL 2026-03-08T22:45:06.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:79: rados_put_get: rados --pool pool-isa get SOMETHING td/test-erasure-code/COPY 2026-03-08T22:45:06.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:80: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:45:06.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:81: rados_put_get: rm td/test-erasure-code/COPY 2026-03-08T22:45:06.238 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: get_osds pool-isa SOMETHING 2026-03-08T22:45:06.238 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-isa 2026-03-08T22:45:06.238 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:45:06.239 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-isa SOMETHING 2026-03-08T22:45:06.239 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:45:06.476 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=4 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:10 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:8 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:9' 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 4 10 0 7 6 8 2 1 5 9 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: initial_osds=('4' '10' '0' '7' '6' '8' '2' '1' '5' '9') 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: local -a initial_osds 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:89: rados_put_get: local last=9 2026-03-08T22:45:06.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:90: rados_put_get: ceph osd out 9 2026-03-08T22:45:06.904 INFO:tasks.workunit.client.0.vm04.stderr:osd.9 is already out. 
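The `initial_osds=(...)` and `local last=9` lines above show `rados_put_get` capturing the acting set into a bash array and picking the last shard's OSD to take out. A standalone illustration of that capture, with `get_osds` stubbed to echo the acting set from this trace (in the real helper it is `ceph --format json osd map | jq '.acting | .[]'`):

```shell
#!/usr/bin/env bash
# Stub standing in for ceph-helpers.sh get_osds; ids copied from the trace.
get_osds() { echo 4 10 0 7 6 8 2 1 5 9; }

# Word-split the helper's output into a bash array, as rados_put_get does.
initial_osds=($(get_osds))
# "last" in the trace is the index of the final shard; its OSD is the one
# the test then marks out with `ceph osd out`.
last=$(( ${#initial_osds[@]} - 1 ))
echo "shards=${#initial_osds[@]} last_osd=${initial_osds[$last]}"
```

With the trace's ten-OSD acting set this prints `shards=10 last_osd=9`, matching the `ceph osd out 9` that follows; for the four-shard `ecpool` case later in the log, `last=3` and `initial_osds[3]` is likewise 9.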
2026-03-08T22:45:06.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:93: rados_put_get: sleep 5 2026-03-08T22:45:11.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: get_osds pool-isa SOMETHING 2026-03-08T22:45:11.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-isa 2026-03-08T22:45:11.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:45:11.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: grep '\<9\>' 2026-03-08T22:45:11.918 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-isa SOMETHING 2026-03-08T22:45:11.918 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:45:12.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=4 2026-03-08T22:45:12.161 INFO:tasks.workunit.client.0.vm04.stderr:10 2026-03-08T22:45:12.161 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:45:12.161 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:45:12.161 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:45:12.161 INFO:tasks.workunit.client.0.vm04.stderr:8 2026-03-08T22:45:12.161 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:45:12.161 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:45:12.161 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:45:12.161 
INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:45:12.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 4 10 0 7 6 8 3 1 5 2 2026-03-08T22:45:12.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:96: rados_put_get: rados --pool pool-isa get SOMETHING td/test-erasure-code/COPY 2026-03-08T22:45:12.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:97: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:45:12.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:98: rados_put_get: ceph osd in 9 2026-03-08T22:45:12.642 INFO:tasks.workunit.client.0.vm04.stderr:osd.9 is already in. 2026-03-08T22:45:12.653 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:100: rados_put_get: rm td/test-erasure-code/ORIGINAL 2026-03-08T22:45:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:208: TEST_rados_put_get_isa: delete_pool pool-isa 2026-03-08T22:45:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=pool-isa 2026-03-08T22:45:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete pool-isa pool-isa --yes-i-really-really-mean-it 2026-03-08T22:45:13.024 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-isa' does not exist 2026-03-08T22:45:13.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:47: run: 
for func in $funcs 2026-03-08T22:45:13.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:48: run: TEST_rados_put_get_jerasure td/test-erasure-code 2026-03-08T22:45:13.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:212: TEST_rados_put_get_jerasure: local dir=td/test-erasure-code 2026-03-08T22:45:13.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:214: TEST_rados_put_get_jerasure: rados_put_get td/test-erasure-code ecpool 2026-03-08T22:45:13.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:66: rados_put_get: local dir=td/test-erasure-code 2026-03-08T22:45:13.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:67: rados_put_get: local poolname=ecpool 2026-03-08T22:45:13.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:68: rados_put_get: local objname=SOMETHING 2026-03-08T22:45:13.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:13.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 AAA 2026-03-08T22:45:13.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:13.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: 
rados_put_get: printf '%*s' 1024 BBB 2026-03-08T22:45:13.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:13.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 CCCC 2026-03-08T22:45:13.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:13.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 DDDD 2026-03-08T22:45:13.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:78: rados_put_get: rados --pool ecpool put SOMETHING td/test-erasure-code/ORIGINAL 2026-03-08T22:45:13.077 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:79: rados_put_get: rados --pool ecpool get SOMETHING td/test-erasure-code/COPY 2026-03-08T22:45:13.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:80: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:45:13.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:81: rados_put_get: rm td/test-erasure-code/COPY 2026-03-08T22:45:13.106 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: get_osds ecpool SOMETHING 2026-03-08T22:45:13.106 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T22:45:13.106 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:45:13.107 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:45:13.107 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:45:13.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=6 2026-03-08T22:45:13.332 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:45:13.332 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:45:13.332 INFO:tasks.workunit.client.0.vm04.stderr:9' 2026-03-08T22:45:13.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 6 0 7 9 2026-03-08T22:45:13.333 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: initial_osds=('6' '0' '7' '9') 2026-03-08T22:45:13.333 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: local -a initial_osds 2026-03-08T22:45:13.333 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:89: rados_put_get: local last=3 2026-03-08T22:45:13.333 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:90: rados_put_get: ceph osd out 9 2026-03-08T22:45:13.610 
INFO:tasks.workunit.client.0.vm04.stderr:osd.9 is already out. 2026-03-08T22:45:13.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:93: rados_put_get: sleep 5 2026-03-08T22:45:18.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: get_osds ecpool SOMETHING 2026-03-08T22:45:18.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: grep '\<9\>' 2026-03-08T22:45:18.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T22:45:18.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:45:18.624 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:45:18.624 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:45:18.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=6 2026-03-08T22:45:18.958 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:45:18.958 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:45:18.958 INFO:tasks.workunit.client.0.vm04.stderr:5' 2026-03-08T22:45:18.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 6 0 7 5 2026-03-08T22:45:18.958 
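The `grep '\<9\>'` line in the trace above is how `rados_put_get` verifies that the out'd OSD has left the acting set: the GNU grep word-boundary anchors `\<`/`\>` match `9` as a whole token without also matching ids like `19` or `90`. A minimal standalone version of that check, using the post-out acting set from this trace:

```shell
#!/usr/bin/env bash
# Acting set after `ceph osd out 9`, copied from the trace (echo 6 0 7 5).
acting="6 0 7 5"
# \< and \> are GNU grep word boundaries: match osd id 9 exactly,
# not as a substring of another id.
if echo "$acting" | grep -q '\<9\>'; then
    echo "osd.9 still in acting set"
else
    echo "osd.9 removed from acting set"
fi
```

In the test script the sense is inverted with `! get_osds ... | grep`, so the test fails if the OSD is still mapped after the out.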
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:96: rados_put_get: rados --pool ecpool get SOMETHING td/test-erasure-code/COPY 2026-03-08T22:45:18.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:97: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:45:18.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:98: rados_put_get: ceph osd in 9 2026-03-08T22:45:19.486 INFO:tasks.workunit.client.0.vm04.stderr:osd.9 is already in. 2026-03-08T22:45:19.496 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:100: rados_put_get: rm td/test-erasure-code/ORIGINAL 2026-03-08T22:45:19.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:216: TEST_rados_put_get_jerasure: local poolname=pool-jerasure 2026-03-08T22:45:19.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:217: TEST_rados_put_get_jerasure: local profile=profile-jerasure 2026-03-08T22:45:19.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:219: TEST_rados_put_get_jerasure: ceph osd erasure-code-profile set profile-jerasure plugin=jerasure k=4 m=2 crush-failure-domain=osd 2026-03-08T22:45:19.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:223: TEST_rados_put_get_jerasure: create_pool pool-jerasure 12 12 erasure profile-jerasure 2026-03-08T22:45:19.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: 
create_pool: ceph osd pool create pool-jerasure 12 12 erasure profile-jerasure 2026-03-08T22:45:20.250 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T22:45:20.261 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:21.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:226: TEST_rados_put_get_jerasure: rados_put_get td/test-erasure-code pool-jerasure 2026-03-08T22:45:21.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:66: rados_put_get: local dir=td/test-erasure-code 2026-03-08T22:45:21.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:67: rados_put_get: local poolname=pool-jerasure 2026-03-08T22:45:21.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:68: rados_put_get: local objname=SOMETHING 2026-03-08T22:45:21.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:21.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 AAA 2026-03-08T22:45:21.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:21.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 BBB 2026-03-08T22:45:21.263 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD
2026-03-08T22:45:21.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 CCCC
2026-03-08T22:45:21.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD
2026-03-08T22:45:21.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 DDDD
2026-03-08T22:45:21.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:78: rados_put_get: rados --pool pool-jerasure put SOMETHING td/test-erasure-code/ORIGINAL
2026-03-08T22:45:21.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:79: rados_put_get: rados --pool pool-jerasure get SOMETHING td/test-erasure-code/COPY
2026-03-08T22:45:21.318 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:80: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY
2026-03-08T22:45:21.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:81: rados_put_get: rm td/test-erasure-code/COPY
2026-03-08T22:45:21.320 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: get_osds pool-jerasure SOMETHING
2026-03-08T22:45:21.320 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T22:45:21.320 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING
2026-03-08T22:45:21.321 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure SOMETHING
2026-03-08T22:45:21.321 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T22:45:21.540 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=5
2026-03-08T22:45:21.540 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:45:21.540 INFO:tasks.workunit.client.0.vm04.stderr:9
2026-03-08T22:45:21.540 INFO:tasks.workunit.client.0.vm04.stderr:4
2026-03-08T22:45:21.540 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:45:21.540 INFO:tasks.workunit.client.0.vm04.stderr:10'
2026-03-08T22:45:21.540 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 5 1 9 4 3 10
2026-03-08T22:45:21.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: initial_osds=('5' '1' '9' '4' '3' '10')
2026-03-08T22:45:21.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: local -a initial_osds
2026-03-08T22:45:21.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:89: rados_put_get: local last=5
2026-03-08T22:45:21.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:90: rados_put_get: ceph osd out 10
2026-03-08T22:45:21.821 INFO:tasks.workunit.client.0.vm04.stderr:osd.10 is already out.
2026-03-08T22:45:21.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:93: rados_put_get: sleep 5
2026-03-08T22:45:26.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: get_osds pool-jerasure SOMETHING
2026-03-08T22:45:26.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T22:45:26.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING
2026-03-08T22:45:26.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: grep '\<10\>'
2026-03-08T22:45:26.836 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure SOMETHING
2026-03-08T22:45:26.836 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T22:45:27.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=5
2026-03-08T22:45:27.058 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:45:27.058 INFO:tasks.workunit.client.0.vm04.stderr:9
2026-03-08T22:45:27.058 INFO:tasks.workunit.client.0.vm04.stderr:4
2026-03-08T22:45:27.059 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:45:27.059 INFO:tasks.workunit.client.0.vm04.stderr:6'
2026-03-08T22:45:27.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 5 1 9 4 3 6
2026-03-08T22:45:27.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:96: rados_put_get: rados --pool pool-jerasure get SOMETHING td/test-erasure-code/COPY
2026-03-08T22:45:27.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:97: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY
2026-03-08T22:45:27.086 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:98: rados_put_get: ceph osd in 10
2026-03-08T22:45:27.335 INFO:tasks.workunit.client.0.vm04.stderr:osd.10 is already in.
2026-03-08T22:45:27.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:100: rados_put_get: rm td/test-erasure-code/ORIGINAL
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:227: TEST_rados_put_get_jerasure: rados_osds_out_in td/test-erasure-code pool-jerasure
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:104: rados_osds_out_in: local dir=td/test-erasure-code
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:105: rados_osds_out_in: local poolname=pool-jerasure
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:106: rados_osds_out_in: local objname=SOMETHING
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:109: rados_osds_out_in: for marker in FFFF GGGG HHHH IIII
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:110: rados_osds_out_in: printf '%*s' 1024 FFFF
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:109: rados_osds_out_in: for marker in FFFF GGGG HHHH IIII
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:110: rados_osds_out_in: printf '%*s' 1024 GGGG
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:109: rados_osds_out_in: for marker in FFFF GGGG HHHH IIII
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:110: rados_osds_out_in: printf '%*s' 1024 HHHH
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:109: rados_osds_out_in: for marker in FFFF GGGG HHHH IIII
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:110: rados_osds_out_in: printf '%*s' 1024 IIII
2026-03-08T22:45:27.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:116: rados_osds_out_in: rados --pool pool-jerasure put SOMETHING td/test-erasure-code/ORIGINAL
2026-03-08T22:45:27.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:117: rados_osds_out_in: rados --pool pool-jerasure get SOMETHING td/test-erasure-code/COPY
2026-03-08T22:45:27.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:118: rados_osds_out_in: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY
2026-03-08T22:45:27.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:119: rados_osds_out_in: rm td/test-erasure-code/COPY
2026-03-08T22:45:27.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:127: rados_osds_out_in: wait_for_clean
2026-03-08T22:45:27.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:45:27.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:45:27.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:45:27.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:45:27.605 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:45:27.605 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:45:27.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:45:27.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:45:27.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:45:27.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:45:27.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:45:27.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:45:27.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:45:27.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:45:27.681 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:4
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:5
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:6
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:7
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:8
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:9
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:10'
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:45:27.900 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:45:27.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836525
2026-03-08T22:45:27.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836525
2026-03-08T22:45:27.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525'
2026-03-08T22:45:27.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:45:27.971 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:45:28.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673003
2026-03-08T22:45:28.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673003
2026-03-08T22:45:28.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673003'
2026-03-08T22:45:28.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:45:28.041 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:45:28.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509482
2026-03-08T22:45:28.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509482
2026-03-08T22:45:28.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673003 2-64424509482'
2026-03-08T22:45:28.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:45:28.113 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:45:28.183 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=309237645335
2026-03-08T22:45:28.183 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 309237645335
2026-03-08T22:45:28.183 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673003 2-64424509482 3-309237645335'
2026-03-08T22:45:28.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:45:28.184 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T22:45:28.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182439
2026-03-08T22:45:28.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182439
2026-03-08T22:45:28.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673003 2-64424509482 3-309237645335 4-107374182439'
2026-03-08T22:45:28.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:45:28.257 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats
2026-03-08T22:45:28.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=335007449107
2026-03-08T22:45:28.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 335007449107
2026-03-08T22:45:28.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673003 2-64424509482 3-309237645335 4-107374182439 5-335007449107'
2026-03-08T22:45:28.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:45:28.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats
2026-03-08T22:45:28.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=395136991247
2026-03-08T22:45:28.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 395136991247
2026-03-08T22:45:28.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673003 2-64424509482 3-309237645335 4-107374182439 5-335007449107 6-395136991247'
2026-03-08T22:45:28.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:45:28.406 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats
2026-03-08T22:45:28.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691875
2026-03-08T22:45:28.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691875
2026-03-08T22:45:28.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673003 2-64424509482 3-309237645335 4-107374182439 5-335007449107 6-395136991247 7-171798691875'
2026-03-08T22:45:28.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:45:28.477 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats
2026-03-08T22:45:28.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528353
2026-03-08T22:45:28.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528353
2026-03-08T22:45:28.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673003 2-64424509482 3-309237645335 4-107374182439 5-335007449107 6-395136991247 7-171798691875 8-193273528353'
2026-03-08T22:45:28.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:45:28.551 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats
2026-03-08T22:45:28.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=416611827723
2026-03-08T22:45:28.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 416611827723
2026-03-08T22:45:28.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673003 2-64424509482 3-309237645335 4-107374182439 5-335007449107 6-395136991247 7-171798691875 8-193273528353 9-416611827723'
2026-03-08T22:45:28.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:45:28.627 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.10 flush_pg_stats
2026-03-08T22:45:28.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=236223201311
2026-03-08T22:45:28.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 236223201311
2026-03-08T22:45:28.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673003 2-64424509482 3-309237645335 4-107374182439 5-335007449107 6-395136991247 7-171798691875 8-193273528353 9-416611827723 10-236223201311'
2026-03-08T22:45:28.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:45:28.698 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836525
2026-03-08T22:45:28.698 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:45:28.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:45:28.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:45:28.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836525
2026-03-08T22:45:28.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836525
2026-03-08T22:45:28.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836525'
2026-03-08T22:45:28.701 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 21474836525
2026-03-08T22:45:28.702 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:45:28.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836523 -lt 21474836525
2026-03-08T22:45:28.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:45:29.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:45:29.925 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:45:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836525 -lt 21474836525
2026-03-08T22:45:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:45:30.149 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673003
2026-03-08T22:45:30.149 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:45:30.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:45:30.151 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673003
2026-03-08T22:45:30.151 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:45:30.152 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673003
2026-03-08T22:45:30.152 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673003'
2026-03-08T22:45:30.152 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 42949673003
2026-03-08T22:45:30.152 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:45:30.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673004 -lt 42949673003
2026-03-08T22:45:30.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:45:30.367 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509482
2026-03-08T22:45:30.367 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:45:30.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:45:30.368 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509482
2026-03-08T22:45:30.368 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:45:30.369 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509482
2026-03-08T22:45:30.369 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509482'
2026-03-08T22:45:30.369 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 64424509482
2026-03-08T22:45:30.369 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:45:30.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509482 -lt 64424509482
2026-03-08T22:45:30.586 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:45:30.587 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-309237645335
2026-03-08T22:45:30.587 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:45:30.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:45:30.588 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-309237645335
2026-03-08T22:45:30.588 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:45:30.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=309237645335
2026-03-08T22:45:30.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 309237645335'
2026-03-08T22:45:30.589 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 309237645335
2026-03-08T22:45:30.589 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:45:30.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 309237645336 -lt 309237645335
2026-03-08T22:45:30.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:45:30.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182439
2026-03-08T22:45:30.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:45:30.842 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T22:45:30.843 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182439
2026-03-08T22:45:30.843 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:45:30.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182439
2026-03-08T22:45:30.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182439'
2026-03-08T22:45:30.844 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 107374182439
2026-03-08T22:45:30.844 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T22:45:31.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182439 -lt 107374182439
2026-03-08T22:45:31.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:45:31.064 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-335007449107
2026-03-08T22:45:31.064 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:45:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5
2026-03-08T22:45:31.066 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-335007449107
2026-03-08T22:45:31.066 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:45:31.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=335007449107
2026-03-08T22:45:31.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 335007449107'
2026-03-08T22:45:31.067 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 335007449107
2026-03-08T22:45:31.067 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5
2026-03-08T22:45:31.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 335007449108 -lt 335007449107
2026-03-08T22:45:31.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:45:31.280 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-395136991247
2026-03-08T22:45:31.280 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:45:31.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6
2026-03-08T22:45:31.281 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-395136991247
2026-03-08T22:45:31.281 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:45:31.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=395136991247
2026-03-08T22:45:31.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 395136991247'
2026-03-08T22:45:31.282 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 395136991247
2026-03-08T22:45:31.282 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6
2026-03-08T22:45:31.502 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 395136991247 -lt 395136991247
2026-03-08T22:45:31.502 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:45:31.503 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691875
2026-03-08T22:45:31.503 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:45:31.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7
2026-03-08T22:45:31.504 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691875
2026-03-08T22:45:31.504 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:45:31.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691875
2026-03-08T22:45:31.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691875'
2026-03-08T22:45:31.505 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.7 seq 171798691875
2026-03-08T22:45:31.505 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7
2026-03-08T22:45:31.722 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691875 -lt 171798691875
2026-03-08T22:45:31.722 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:45:31.722 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528353
2026-03-08T22:45:31.722 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:45:31.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8
2026-03-08T22:45:31.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528353
2026-03-08T22:45:31.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:45:31.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats:
seq=193273528353 2026-03-08T22:45:31.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528353' 2026-03-08T22:45:31.725 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.8 seq 193273528353 2026-03-08T22:45:31.725 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T22:45:31.945 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528354 -lt 193273528353 2026-03-08T22:45:31.945 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:31.946 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-416611827723 2026-03-08T22:45:31.946 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:31.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T22:45:31.947 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-416611827723 2026-03-08T22:45:31.947 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:31.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=416611827723 2026-03-08T22:45:31.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: 
flush_pg_stats: echo 'waiting osd.9 seq 416611827723' 2026-03-08T22:45:31.948 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.9 seq 416611827723 2026-03-08T22:45:31.948 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T22:45:32.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 416611827724 -lt 416611827723 2026-03-08T22:45:32.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:32.167 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 10-236223201311 2026-03-08T22:45:32.167 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:32.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=10 2026-03-08T22:45:32.169 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 10-236223201311 2026-03-08T22:45:32.169 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:32.169 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=236223201311 2026-03-08T22:45:32.169 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.10 seq 236223201311 2026-03-08T22:45:32.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.10 seq 236223201311' 
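The loop traced above (ceph-helpers.sh lines 2273-2279) splits each `<osd>-<seq>` entry with `cut`, prints `waiting osd.N seq S`, then polls `ceph osd last-stat-seq` until the reported sequence catches up or the timeout expires. A minimal sketch of that wait phase, with a stub standing in for the `ceph` CLI (the stub and its canned sequence numbers are illustrative assumptions, not values from this run):

```shell
#!/usr/bin/env bash
# Stub standing in for `ceph osd last-stat-seq <id>`; the canned values
# below are illustrative only.
last_stat_seq() { echo "${FAKE_SEQ[$1]}"; }

# Wait phase of flush_pg_stats as seen in the xtrace: parse each
# "<osd>-<seq>" pair, then poll until last-stat-seq >= seq.
wait_for_seqs() {
    local timeout=300
    local s osd seq
    for s in "$@"; do
        osd=$(echo "$s" | cut -d - -f 1)
        seq=$(echo "$s" | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        while test "$(last_stat_seq "$osd")" -lt "$seq"; do
            sleep 1
            timeout=$((timeout - 1))
            [ "$timeout" -eq 0 ] && return 1
        done
    done
    return 0
}

FAKE_SEQ=(21474836528 42949673007)
wait_for_seqs 0-21474836528 1-42949673007 && echo flushed
```

In the trace the loop usually exits on the first `test`, because the flush already bumped the OSD's stat sequence past the recorded one (e.g. `test 416611827724 -lt 416611827723` is false).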
2026-03-08T22:45:32.170 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 10 2026-03-08T22:45:32.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 236223201312 -lt 236223201311 2026-03-08T22:45:32.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:45:32.393 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:45:32.393 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:32.683 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 28 == 0 2026-03-08T22:45:32.683 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:45:32.684 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:45:32.684 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:45:32.684 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:45:32.684 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 
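`get_num_active_clean` (assembled at lines 1364-1368 above) counts PGs whose state string contains both "active" and "clean" but not "stale", by piping `ceph --format json pg dump pgs` through jq. The same selection logic rendered in plain shell on bare state strings (a sketch for illustration; the real helper operates on the `pg_stats` JSON, and the sample states below are hypothetical):

```shell
# Count PG states containing "active" and "clean" but not "stale",
# mirroring the jq filter get_num_active_clean builds up.
count_active_clean() {
    local n=0 state
    for state in "$@"; do
        case $state in
            *stale*) ;;                          # select(contains("stale") | not)
            *active*) case $state in             # select(contains("active") ...
                          *clean*) n=$((n+1)) ;; # ... and contains("clean"))
                      esac ;;
        esac
    done
    echo "$n"
}

# Hypothetical states: only the first should count.
count_active_clean active+clean stale+active+clean active+recovering
```

With all 28 PGs in `active+clean` and none stale, the count equals `get_num_pgs`, so `wait_for_clean` hits its `test 28 = 28` break on the first pass, as seen above.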
2026-03-08T22:45:32.684 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:45:32.684 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:45:32.896 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=28 2026-03-08T22:45:32.896 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:45:32.896 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:45:32.896 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:33.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 28 = 28 2026-03-08T22:45:33.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:45:33.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:45:33.187 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:128: rados_osds_out_in: get_osds pool-jerasure SOMETHING 2026-03-08T22:45:33.187 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:45:33.187 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:45:33.187 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure SOMETHING 2026-03-08T22:45:33.188 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:45:33.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=5 2026-03-08T22:45:33.417 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:45:33.417 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:45:33.417 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:45:33.417 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:45:33.417 INFO:tasks.workunit.client.0.vm04.stderr:10' 2026-03-08T22:45:33.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 5 1 9 4 3 10 2026-03-08T22:45:33.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:128: rados_osds_out_in: local 'osds_list=5 1 9 4 3 10' 2026-03-08T22:45:33.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:129: rados_osds_out_in: osds=('5' '1' '9' '4' '3' '10') 2026-03-08T22:45:33.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:129: rados_osds_out_in: local -a osds 2026-03-08T22:45:33.417 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:130: rados_osds_out_in: for osd in 0 1 2026-03-08T22:45:33.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:131: rados_osds_out_in: ceph osd out 5 2026-03-08T22:45:33.714 INFO:tasks.workunit.client.0.vm04.stderr:osd.5 is already out. 2026-03-08T22:45:33.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:130: rados_osds_out_in: for osd in 0 1 2026-03-08T22:45:33.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:131: rados_osds_out_in: ceph osd out 1 2026-03-08T22:45:34.131 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 is already out. 2026-03-08T22:45:34.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:133: rados_osds_out_in: wait_for_clean 2026-03-08T22:45:34.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:45:34.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:45:34.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:45:34.180 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:45:34.181 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:45:34.181 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:45:34.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:45:34.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:45:34.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:45:34.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:45:34.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:45:34.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:45:34.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:45:34.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:45:34.283 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:1 
2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:8 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:10' 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.499 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:45:34.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836528 2026-03-08T22:45:34.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836528 2026-03-08T22:45:34.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836528' 2026-03-08T22:45:34.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.574 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 
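Earlier in this pass, `get_timeout_delays 90 .1` produced the backoff schedule `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`: delays that double from the first step, are capped per step, and are trimmed so the total equals the 90 s timeout. A sketch reproducing that schedule (the 15 s per-step cap is an assumption inferred from the observed values; the real helper lives at ceph-helpers.sh line 1602):

```shell
# Doubling backoff schedule: start at $2, double each step, cap each
# step (cap inferred as 15s from the trace), and shrink the final step
# so the delays sum exactly to the requested timeout.
get_timeout_delays() {
    awk -v timeout="$1" -v first="${2:-1}" -v max="${3:-15}" 'BEGIN {
        total = 0; d = first; sep = ""
        while (1) {
            if (d > max) d = max
            if (total + d >= timeout) {          # final, possibly partial step
                printf "%s%g\n", sep, timeout - total
                break
            }
            printf "%s%g", sep, d
            sep = " "; total += d; d *= 2
        }
    }'
}

get_timeout_delays 90 .1
```

The schedule lets `wait_for_clean` re-check quickly at first (0.1 s) while bounding total wait time at exactly the 90 s budget.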
2026-03-08T22:45:34.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673007 2026-03-08T22:45:34.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673007 2026-03-08T22:45:34.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836528 1-42949673007' 2026-03-08T22:45:34.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.647 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:45:34.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509485 2026-03-08T22:45:34.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509485 2026-03-08T22:45:34.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836528 1-42949673007 2-64424509485' 2026-03-08T22:45:34.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.719 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:45:34.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=309237645339 2026-03-08T22:45:34.791 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 309237645339 2026-03-08T22:45:34.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836528 1-42949673007 2-64424509485 3-309237645339' 2026-03-08T22:45:34.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.791 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:45:34.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182442 2026-03-08T22:45:34.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182442 2026-03-08T22:45:34.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836528 1-42949673007 2-64424509485 3-309237645339 4-107374182442' 2026-03-08T22:45:34.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.864 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:45:34.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=335007449111 2026-03-08T22:45:34.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
335007449111 2026-03-08T22:45:34.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836528 1-42949673007 2-64424509485 3-309237645339 4-107374182442 5-335007449111' 2026-03-08T22:45:34.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:34.937 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:45:35.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=395136991250 2026-03-08T22:45:35.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 395136991250 2026-03-08T22:45:35.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836528 1-42949673007 2-64424509485 3-309237645339 4-107374182442 5-335007449111 6-395136991250' 2026-03-08T22:45:35.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:35.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T22:45:35.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691878 2026-03-08T22:45:35.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691878 2026-03-08T22:45:35.083 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836528 1-42949673007 2-64424509485 3-309237645339 4-107374182442 5-335007449111 6-395136991250 7-171798691878' 2026-03-08T22:45:35.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:35.083 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T22:45:35.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528357 2026-03-08T22:45:35.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528357 2026-03-08T22:45:35.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836528 1-42949673007 2-64424509485 3-309237645339 4-107374182442 5-335007449111 6-395136991250 7-171798691878 8-193273528357' 2026-03-08T22:45:35.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:35.160 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T22:45:35.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=416611827726 2026-03-08T22:45:35.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 416611827726 2026-03-08T22:45:35.232 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836528 1-42949673007 2-64424509485 3-309237645339 4-107374182442 5-335007449111 6-395136991250 7-171798691878 8-193273528357 9-416611827726' 2026-03-08T22:45:35.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:35.233 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.10 flush_pg_stats 2026-03-08T22:45:35.305 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=236223201314 2026-03-08T22:45:35.305 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 236223201314 2026-03-08T22:45:35.305 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836528 1-42949673007 2-64424509485 3-309237645339 4-107374182442 5-335007449111 6-395136991250 7-171798691878 8-193273528357 9-416611827726 10-236223201314' 2026-03-08T22:45:35.305 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:35.305 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:35.305 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836528 2026-03-08T22:45:35.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 
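The collection pass traced above iterates the ids from `ceph osd ls`, tells each OSD to flush its PG stats, and appends the returned sequence number to `seqs` as `<osd>-<seq>` (skipping OSDs that return nothing, per the `test -z` guard at line 2266). A sketch of that accumulation with both `ceph` calls stubbed out (stub names and canned sequences are illustrative; how the real helper handles an empty reply beyond the guard is not shown in this trace):

```shell
# Stubs for the two ceph CLI calls used in the collection pass.
osd_ls() { printf '%s\n' 0 1 2; }                  # `ceph osd ls`
tell_flush() { echo "$(( ($1 + 1) * 1000 ))"; }    # `ceph tell osd.N flush_pg_stats`

collect_seqs() {
    local seqs= osd seq
    for osd in $(osd_ls); do
        seq=$(tell_flush "$osd")
        if test -z "$seq"; then                    # guard seen at line 2266
            continue                               # (sketch: skip silent OSDs)
        fi
        seqs="$seqs $osd-$seq"
    done
    echo "$seqs"
}

collect_seqs
```

The resulting list (here ` 0-1000 1-2000 2-3000`, leading space included, matching the trace's `seqs=' 0-...'`) is what the subsequent wait loop consumes one `<osd>-<seq>` pair at a time.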
2026-03-08T22:45:35.307 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836528 2026-03-08T22:45:35.307 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:35.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836528 2026-03-08T22:45:35.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836528' 2026-03-08T22:45:35.308 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 21474836528 2026-03-08T22:45:35.308 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:45:35.521 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836526 -lt 21474836528 2026-03-08T22:45:35.521 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:45:36.522 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:45:36.522 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:45:36.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836529 -lt 21474836528 2026-03-08T22:45:36.735 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:36.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673007 2026-03-08T22:45:36.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:36.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:45:36.738 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673007 2026-03-08T22:45:36.738 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:36.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673007 2026-03-08T22:45:36.739 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 42949673007 2026-03-08T22:45:36.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673007' 2026-03-08T22:45:36.739 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:45:36.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673007 -lt 42949673007 2026-03-08T22:45:36.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T22:45:36.969 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509485 2026-03-08T22:45:36.969 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:36.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:45:36.971 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509485 2026-03-08T22:45:36.971 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:36.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509485 2026-03-08T22:45:36.972 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 64424509485 2026-03-08T22:45:36.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509485' 2026-03-08T22:45:36.973 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:45:37.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509486 -lt 64424509485 2026-03-08T22:45:37.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:37.216 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
3-309237645339 2026-03-08T22:45:37.216 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:37.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:45:37.218 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-309237645339 2026-03-08T22:45:37.218 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:37.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=309237645339 2026-03-08T22:45:37.219 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 309237645339 2026-03-08T22:45:37.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 309237645339' 2026-03-08T22:45:37.219 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:45:37.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 309237645339 -lt 309237645339 2026-03-08T22:45:37.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:37.473 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182442 2026-03-08T22:45:37.473 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:37.474 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:45:37.475 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182442 2026-03-08T22:45:37.475 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:37.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182442 2026-03-08T22:45:37.476 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 107374182442 2026-03-08T22:45:37.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182442' 2026-03-08T22:45:37.476 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:45:37.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182442 -lt 107374182442 2026-03-08T22:45:37.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:37.705 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-335007449111 2026-03-08T22:45:37.705 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:37.706 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:45:37.707 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-335007449111 2026-03-08T22:45:37.707 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:37.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=335007449111 2026-03-08T22:45:37.708 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 335007449111 2026-03-08T22:45:37.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 335007449111' 2026-03-08T22:45:37.709 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:45:37.985 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 335007449111 -lt 335007449111 2026-03-08T22:45:37.985 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:37.985 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-395136991250 2026-03-08T22:45:37.986 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:37.987 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 
2026-03-08T22:45:37.987 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-395136991250 2026-03-08T22:45:37.987 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:37.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=395136991250 2026-03-08T22:45:37.988 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 395136991250 2026-03-08T22:45:37.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 395136991250' 2026-03-08T22:45:37.989 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:45:38.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 395136991251 -lt 395136991250 2026-03-08T22:45:38.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:38.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691878 2026-03-08T22:45:38.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:38.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T22:45:38.236 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
echo 7-171798691878 2026-03-08T22:45:38.237 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:38.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691878 2026-03-08T22:45:38.237 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.7 seq 171798691878 2026-03-08T22:45:38.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691878' 2026-03-08T22:45:38.238 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T22:45:38.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691879 -lt 171798691878 2026-03-08T22:45:38.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:38.464 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528357 2026-03-08T22:45:38.464 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:38.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T22:45:38.465 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528357 2026-03-08T22:45:38.465 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:38.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528357 2026-03-08T22:45:38.466 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.8 seq 193273528357 2026-03-08T22:45:38.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528357' 2026-03-08T22:45:38.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T22:45:38.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528357 -lt 193273528357 2026-03-08T22:45:38.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:38.686 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-416611827726 2026-03-08T22:45:38.686 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:38.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T22:45:38.687 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-416611827726 2026-03-08T22:45:38.687 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T22:45:38.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=416611827726 2026-03-08T22:45:38.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 416611827726' 2026-03-08T22:45:38.689 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.9 seq 416611827726 2026-03-08T22:45:38.689 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T22:45:38.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 416611827727 -lt 416611827726 2026-03-08T22:45:38.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:38.902 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 10-236223201314 2026-03-08T22:45:38.902 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:38.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=10 2026-03-08T22:45:38.904 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 10-236223201314 2026-03-08T22:45:38.904 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:38.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=236223201314 2026-03-08T22:45:38.905 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.10 seq 236223201314 2026-03-08T22:45:38.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.10 seq 236223201314' 2026-03-08T22:45:38.905 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 10 2026-03-08T22:45:39.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 236223201315 -lt 236223201314 2026-03-08T22:45:39.120 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:45:39.120 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:45:39.120 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:39.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 28 == 0 2026-03-08T22:45:39.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:45:39.414 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:45:39.414 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:45:39.414 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:45:39.414 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:45:39.414 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:45:39.414 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:45:39.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=28 2026-03-08T22:45:39.634 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:45:39.634 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:45:39.635 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:39.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 28 = 28 2026-03-08T22:45:39.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:45:39.916 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:45:39.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:137: rados_osds_out_in: for osd in 0 1 2026-03-08T22:45:39.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:138: rados_osds_out_in: get_osds pool-jerasure SOMETHING 2026-03-08T22:45:39.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:138: rados_osds_out_in: grep '\<5\>' 2026-03-08T22:45:39.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:45:39.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:45:39.917 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure SOMETHING 2026-03-08T22:45:39.917 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:45:40.152 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=2 2026-03-08T22:45:40.152 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:45:40.152 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:45:40.152 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:45:40.152 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:45:40.152 INFO:tasks.workunit.client.0.vm04.stderr:10' 2026-03-08T22:45:40.152 
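The `wait_for_clean` trace above polls `get_num_active_clean` against `get_num_pgs` and breaks as soon as they match (`test 28 = 28` here). A minimal sketch of that poll loop, with both helpers stubbed and a shortened delay list standing in for the `get_timeout_delays` schedule:

```shell
# Sketch of the wait_for_clean poll (ceph-helpers.sh:1656-1687). Both
# helpers are stubs; the real ones query `ceph status` / `ceph pg dump pgs`.
get_num_pgs() { echo 28; }
get_num_active_clean() { echo 28; }

wait_for_clean_sketch() {
    # Shortened stand-in for the get_timeout_delays backoff schedule.
    for delay in 0.1 0.2 0.4; do
        test "$(get_num_active_clean)" = "$(get_num_pgs)" && return 0
        sleep "$delay"
    done
    return 1   # PGs never all went active+clean within the schedule
}

wait_for_clean_sketch
```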
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 2 7 9 4 3 10 2026-03-08T22:45:40.152 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:137: rados_osds_out_in: for osd in 0 1 2026-03-08T22:45:40.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:138: rados_osds_out_in: get_osds pool-jerasure SOMETHING 2026-03-08T22:45:40.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:45:40.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:138: rados_osds_out_in: grep '\<1\>' 2026-03-08T22:45:40.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:45:40.153 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure SOMETHING 2026-03-08T22:45:40.154 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:45:40.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=2 2026-03-08T22:45:40.374 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:45:40.374 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:45:40.374 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:45:40.374 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:45:40.374 INFO:tasks.workunit.client.0.vm04.stderr:10' 2026-03-08T22:45:40.374 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 2 7 9 4 3 10 2026-03-08T22:45:40.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:140: rados_osds_out_in: rados --pool pool-jerasure get SOMETHING td/test-erasure-code/COPY 2026-03-08T22:45:40.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:141: rados_osds_out_in: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:45:40.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:147: rados_osds_out_in: for osd in 0 1 2026-03-08T22:45:40.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:148: rados_osds_out_in: ceph osd in 5 2026-03-08T22:45:40.657 INFO:tasks.workunit.client.0.vm04.stderr:osd.5 is already in. 2026-03-08T22:45:40.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:147: rados_osds_out_in: for osd in 0 1 2026-03-08T22:45:40.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:148: rados_osds_out_in: ceph osd in 1 2026-03-08T22:45:41.062 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 is already in. 
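`rados_osds_out_in` above decides whether an OSD belongs to the acting set by grepping the flattened `get_osds` output with a word-boundary pattern (`grep '\<5\>'`). A sketch of that membership check using `grep -w`, which is equivalent for plain numeric IDs; the acting set is the one echoed in the trace:

```shell
# Membership test against a flattened acting set, as in
# test-erasure-code.sh:138. Acting set taken from the trace above.
acting="2 7 9 4 3 10"
in_acting() { echo "$acting" | grep -qw "$1"; }

in_acting 10 && echo "osd.10 is acting"
in_acting 1 || echo "osd.1 is not acting"
```

The word match matters: querying 1 must not falsely match the 10 in the set, which a plain substring grep would.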
2026-03-08T22:45:41.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:150: rados_osds_out_in: wait_for_clean 2026-03-08T22:45:41.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:45:41.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:45:41.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:45:41.073 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:45:41.074 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:45:41.074 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:45:41.074 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:45:41.074 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:45:41.074 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:45:41.151 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 
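`get_timeout_delays 90 .1` above expanded to the schedule `0.1 0.2 0.4 ... 15 15 15 15 4.5`: delays double from the initial step, are capped per step, and the final entry is trimmed so the whole schedule sums to the 90-second timeout (25.5 + 4x15 + 4.5 = 90). A sketch reproducing that schedule; the per-step cap of 15 is inferred from the trace, not read from ceph-helpers.sh:

```shell
# Reproduce the backoff schedule seen in the wait_for_clean trace.
# Doubling from $2, capped at 15s per step (inferred), trimmed to sum to $1.
backoff_delays() {
    awk -v timeout="$1" -v first="$2" -v cap=15 'BEGIN {
        d = first; total = 0
        while (total < timeout - 1e-9) {
            if (d > cap) d = cap                         # cap each step
            if (total + d > timeout) d = timeout - total # trim the last step
            printf "%s%g", (total > 0 ? " " : ""), d
            total += d
            d *= 2
        }
        print ""
    }'
}

backoff_delays 90 .1
# -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```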
2026-03-08T22:45:41.151 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:45:41.151 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:45:41.151 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:45:41.151 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:45:41.152 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:8 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:10' 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for 
osd in $ids 2026-03-08T22:45:41.427 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:45:41.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836532 2026-03-08T22:45:41.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836532 2026-03-08T22:45:41.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532' 2026-03-08T22:45:41.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.496 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:45:41.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673010 2026-03-08T22:45:41.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673010 2026-03-08T22:45:41.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673010' 2026-03-08T22:45:41.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.568 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:45:41.640 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509489 2026-03-08T22:45:41.640 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509489 2026-03-08T22:45:41.640 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673010 2-64424509489' 2026-03-08T22:45:41.640 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.640 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:45:41.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=309237645342 2026-03-08T22:45:41.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 309237645342 2026-03-08T22:45:41.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673010 2-64424509489 3-309237645342' 2026-03-08T22:45:41.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.716 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:45:41.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182446 2026-03-08T22:45:41.792 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182446 2026-03-08T22:45:41.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673010 2-64424509489 3-309237645342 4-107374182446' 2026-03-08T22:45:41.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.792 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:45:41.867 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=335007449114 2026-03-08T22:45:41.867 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 335007449114 2026-03-08T22:45:41.867 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673010 2-64424509489 3-309237645342 4-107374182446 5-335007449114' 2026-03-08T22:45:41.867 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.867 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:45:41.943 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=395136991254 2026-03-08T22:45:41.943 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 395136991254 2026-03-08T22:45:41.943 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673010 2-64424509489 3-309237645342 4-107374182446 5-335007449114 6-395136991254' 2026-03-08T22:45:41.943 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:41.945 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T22:45:42.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691882 2026-03-08T22:45:42.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691882 2026-03-08T22:45:42.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673010 2-64424509489 3-309237645342 4-107374182446 5-335007449114 6-395136991254 7-171798691882' 2026-03-08T22:45:42.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:42.019 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T22:45:42.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528360 2026-03-08T22:45:42.092 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528360 2026-03-08T22:45:42.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673010 2-64424509489 3-309237645342 4-107374182446 5-335007449114 6-395136991254 7-171798691882 8-193273528360' 2026-03-08T22:45:42.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:42.092 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T22:45:42.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=416611827730 2026-03-08T22:45:42.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 416611827730 2026-03-08T22:45:42.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673010 2-64424509489 3-309237645342 4-107374182446 5-335007449114 6-395136991254 7-171798691882 8-193273528360 9-416611827730' 2026-03-08T22:45:42.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:45:42.168 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.10 flush_pg_stats 2026-03-08T22:45:42.242 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=236223201318 
2026-03-08T22:45:42.242 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 236223201318 2026-03-08T22:45:42.242 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673010 2-64424509489 3-309237645342 4-107374182446 5-335007449114 6-395136991254 7-171798691882 8-193273528360 9-416611827730 10-236223201318' 2026-03-08T22:45:42.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:42.243 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836532 2026-03-08T22:45:42.243 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:42.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:45:42.244 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836532 2026-03-08T22:45:42.245 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:42.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836532 2026-03-08T22:45:42.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836532' 2026-03-08T22:45:42.246 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 21474836532 2026-03-08T22:45:42.246 
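The xtrace above walks the seq-collection half of the `flush_pg_stats` helper (ceph-helpers.sh:2264-2270): each OSD is told to flush, the returned sequence number is checked for emptiness, and an `osd-seq` pair is appended to `seqs`. A minimal runnable sketch of that loop, with `ceph tell` replaced by a stand-in (`fake_ceph_tell`, invented here) since no cluster is available:

```shell
# Stand-in for `ceph tell osd.N flush_pg_stats`, which prints the
# flush sequence number; values here are illustrative only.
fake_ceph_tell() { echo "$((21474836532 + $1))"; }

# Sketch of the collection phase of flush_pg_stats, reconstructed
# from the xtrace above: one "osd-seq" pair per OSD id.
flush_pg_stats_collect() {
    local ids="$*"
    local seqs='' osd seq
    for osd in $ids; do
        # Real helper: seq=$(ceph tell osd.$osd flush_pg_stats)
        seq=$(fake_ceph_tell "$osd")
        if test -z "$seq"; then return 1; fi
        seqs="$seqs ${osd}-${seq}"
    done
    echo "$seqs"
}

flush_pg_stats_collect 0 1 2
```

The pairs are kept as a flat string rather than an array, which is why the later wait phase re-splits them with `cut`.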
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:45:42.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836532 -lt 21474836532 2026-03-08T22:45:42.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:42.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673010 2026-03-08T22:45:42.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:42.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:45:42.468 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673010 2026-03-08T22:45:42.469 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:42.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673010 2026-03-08T22:45:42.469 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 42949673010 2026-03-08T22:45:42.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673010' 2026-03-08T22:45:42.470 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 
2026-03-08T22:45:42.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673010 -lt 42949673010 2026-03-08T22:45:42.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:42.692 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509489 2026-03-08T22:45:42.692 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:42.693 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:45:42.694 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509489 2026-03-08T22:45:42.694 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:42.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509489 2026-03-08T22:45:42.695 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 64424509489 2026-03-08T22:45:42.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509489' 2026-03-08T22:45:42.695 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:45:42.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 
64424509489 -lt 64424509489 2026-03-08T22:45:42.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:42.909 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-309237645342 2026-03-08T22:45:42.909 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:42.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:45:42.911 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-309237645342 2026-03-08T22:45:42.911 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:42.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=309237645342 2026-03-08T22:45:42.912 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 309237645342 2026-03-08T22:45:42.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 309237645342' 2026-03-08T22:45:42.912 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:45:43.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 309237645342 -lt 309237645342 2026-03-08T22:45:43.130 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:43.131 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182446 2026-03-08T22:45:43.131 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:43.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:45:43.132 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182446 2026-03-08T22:45:43.132 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:43.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182446 2026-03-08T22:45:43.133 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 107374182446 2026-03-08T22:45:43.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182446' 2026-03-08T22:45:43.133 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:45:43.360 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182446 -lt 107374182446 2026-03-08T22:45:43.360 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T22:45:43.360 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-335007449114 2026-03-08T22:45:43.360 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:43.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:45:43.362 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-335007449114 2026-03-08T22:45:43.362 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:43.363 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=335007449114 2026-03-08T22:45:43.363 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 335007449114 2026-03-08T22:45:43.363 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 335007449114' 2026-03-08T22:45:43.363 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:45:43.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 335007449114 -lt 335007449114 2026-03-08T22:45:43.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:43.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
echo 6-395136991254 2026-03-08T22:45:43.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:43.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:45:43.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-395136991254 2026-03-08T22:45:43.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:43.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=395136991254 2026-03-08T22:45:43.585 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 395136991254 2026-03-08T22:45:43.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 395136991254' 2026-03-08T22:45:43.585 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:45:43.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 395136991251 -lt 395136991254 2026-03-08T22:45:43.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:45:44.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:45:44.821 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:45:45.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 395136991254 -lt 395136991254 2026-03-08T22:45:45.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:45.048 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691882 2026-03-08T22:45:45.048 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:45.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T22:45:45.049 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691882 2026-03-08T22:45:45.049 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:45.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691882 2026-03-08T22:45:45.050 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.7 seq 171798691882 2026-03-08T22:45:45.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691882' 2026-03-08T22:45:45.051 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T22:45:45.280 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691882 -lt 171798691882 2026-03-08T22:45:45.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:45.281 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528360 2026-03-08T22:45:45.281 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:45.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T22:45:45.283 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528360 2026-03-08T22:45:45.283 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:45.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528360 2026-03-08T22:45:45.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528360' 2026-03-08T22:45:45.284 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.8 seq 193273528360 2026-03-08T22:45:45.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T22:45:45.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528360 -lt 
193273528360 2026-03-08T22:45:45.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:45:45.514 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-416611827730 2026-03-08T22:45:45.514 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:45.515 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T22:45:45.516 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-416611827730 2026-03-08T22:45:45.516 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:45.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=416611827730 2026-03-08T22:45:45.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 416611827730' 2026-03-08T22:45:45.517 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.9 seq 416611827730 2026-03-08T22:45:45.517 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T22:45:45.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 416611827730 -lt 416611827730 2026-03-08T22:45:45.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T22:45:45.758 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 10-236223201318 2026-03-08T22:45:45.759 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:45:45.759 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=10 2026-03-08T22:45:45.760 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 10-236223201318 2026-03-08T22:45:45.760 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:45:45.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=236223201318 2026-03-08T22:45:45.761 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.10 seq 236223201318 2026-03-08T22:45:45.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.10 seq 236223201318' 2026-03-08T22:45:45.761 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 10 2026-03-08T22:45:46.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 236223201318 -lt 236223201318 2026-03-08T22:45:46.030 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:45:46.030 
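The trace above is the wait half of `flush_pg_stats` (ceph-helpers.sh:2273-2279): each `osd-seq` pair is split on `-` with `cut`, then `ceph osd last-stat-seq` is polled until it is no longer behind the recorded seq, sleeping 1s per retry against a retry budget of 300. osd.6 is the one OSD above that needed a retry (395136991251 < 395136991254). A runnable sketch of that loop, with the `ceph` call replaced by a stand-in (`poll_last_stat_seq`, invented here) that is behind on its first poll to mimic the osd.6 case:

```shell
# Stand-in for `ceph osd last-stat-seq $osd`: reports a stale value on
# the first poll and the caught-up value afterwards, like osd.6 above.
CALLS=0
poll_last_stat_seq() {
    CALLS=$((CALLS + 1))
    if [ "$CALLS" -eq 1 ]; then LAST=395136991251; else LAST=395136991254; fi
}

# Sketch of the wait phase of flush_pg_stats, reconstructed from the
# xtrace: split each "osd-seq" pair, then poll until caught up.
wait_for_seqs() {
    local s osd seq retries
    for s in $1; do
        osd=$(echo "$s" | cut -d - -f 1)
        seq=$(echo "$s" | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        retries=300
        while poll_last_stat_seq "$osd" && test "$LAST" -lt "$seq"; do
            sleep 1
            retries=$((retries - 1))
            if [ "$retries" -eq 0 ]; then return 1; fi
        done
    done
}

wait_for_seqs "6-395136991254"
```

The `-lt` comparison works because `last-stat-seq` is monotonically increasing per OSD, so "not less than the flush seq" means the stats for that flush have been reported.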
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:45:46.030 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:46.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 28 == 0 2026-03-08T22:45:46.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:45:46.433 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:45:46.433 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:45:46.433 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:45:46.433 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:45:46.433 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:45:46.434 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:45:46.695 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=28 2026-03-08T22:45:46.696 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:45:46.696 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:45:46.696 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:45:47.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 28 = 28 2026-03-08T22:45:47.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:45:47.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:45:47.141 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:151: rados_osds_out_in: get_osds pool-jerasure SOMETHING 2026-03-08T22:45:47.141 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:45:47.141 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:45:47.141 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure SOMETHING 2026-03-08T22:45:47.141 
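`wait_for_clean` above breaks out of its loop once `get_num_active_clean` equals `get_num_pgs` (28 = 28). The jq filter it traces keeps PG states that contain both "active" and "clean" but not "stale". A rough shell rendering of that state filter, assuming no jq; note the `case` pattern assumes "active" precedes "clean" in the state string, which holds for Ceph's canonical state names (e.g. `active+clean`, `active+clean+scrubbing`), whereas jq's `contains` is order-independent:

```shell
# Count PG states that contain "active" and "clean" but not "stale",
# mirroring the jq filter in get_num_active_clean above.
count_active_clean() {
    local count=0 st
    for st in "$@"; do
        case "$st" in
            *stale*) ;;                            # stale PGs never count
            *active*clean*) count=$((count + 1)) ;;
        esac
    done
    echo "$count"
}

count_active_clean active+clean active+clean+scrubbing stale+active+clean peering
```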
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:45:47.489 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=5 2026-03-08T22:45:47.489 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:45:47.489 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:45:47.489 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:45:47.489 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:45:47.489 INFO:tasks.workunit.client.0.vm04.stderr:10' 2026-03-08T22:45:47.489 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 5 1 9 4 3 10 2026-03-08T22:45:47.489 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:151: rados_osds_out_in: test '5 1 9 4 3 10' = '5 1 9 4 3 10' 2026-03-08T22:45:47.489 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:152: rados_osds_out_in: rm td/test-erasure-code/ORIGINAL 2026-03-08T22:45:47.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:229: TEST_rados_put_get_jerasure: delete_pool pool-jerasure 2026-03-08T22:45:47.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=pool-jerasure 2026-03-08T22:45:47.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it 2026-03-08T22:45:47.795 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T22:45:47.807 
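`get_osds` above captures jq's one-OSD-per-line output of `.acting | .[]` into a local (the multi-line `osds=5 ... 10'` assignment at ceph-helpers.sh:1025) and then flattens it with an unquoted `echo $osds`: word splitting collapses the embedded newlines into single spaces, producing the `5 1 9 4 3 10` seen one line later. The idiom in isolation:

```shell
# Unquoted expansion word-splits on IFS whitespace (including
# newlines), so echo re-joins the list with single spaces -- the
# flattening trick get_osds uses on jq's '.acting | .[]' output.
flatten() {
    local osds="$1"
    echo $osds    # intentionally unquoted
}

flatten "$(printf '5\n1\n9\n4\n3\n10\n')"
```

This is safe here because OSD ids are bare integers; with arbitrary strings, unquoted expansion would also trigger glob expansion.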
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:230: TEST_rados_put_get_jerasure: ceph osd erasure-code-profile rm profile-jerasure 2026-03-08T22:45:48.115 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile profile-jerasure does not exist 2026-03-08T22:45:48.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:47: run: for func in $funcs 2026-03-08T22:45:48.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:48: run: TEST_rados_put_get_lrc_advanced td/test-erasure-code 2026-03-08T22:45:48.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:156: TEST_rados_put_get_lrc_advanced: local dir=td/test-erasure-code 2026-03-08T22:45:48.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:157: TEST_rados_put_get_lrc_advanced: local poolname=pool-lrc-a 2026-03-08T22:45:48.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:158: TEST_rados_put_get_lrc_advanced: local profile=profile-lrc-a 2026-03-08T22:45:48.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:160: TEST_rados_put_get_lrc_advanced: ceph osd erasure-code-profile set profile-lrc-a plugin=lrc mapping=DD_ 'crush-steps=[ [ "chooseleaf", "osd", 0 ] ]' 'layers=[ [ "DDc", "" ] ]' 2026-03-08T22:45:48.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:165: TEST_rados_put_get_lrc_advanced: create_pool pool-lrc-a 12 12 erasure profile-lrc-a 2026-03-08T22:45:48.419 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-lrc-a 12 12 erasure profile-lrc-a 2026-03-08T22:45:48.801 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-lrc-a' already exists 2026-03-08T22:45:48.812 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:49.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:168: TEST_rados_put_get_lrc_advanced: rados_put_get td/test-erasure-code pool-lrc-a 2026-03-08T22:45:49.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:66: rados_put_get: local dir=td/test-erasure-code 2026-03-08T22:45:49.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:67: rados_put_get: local poolname=pool-lrc-a 2026-03-08T22:45:49.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:68: rados_put_get: local objname=SOMETHING 2026-03-08T22:45:49.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:49.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 AAA 2026-03-08T22:45:49.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:49.854 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 BBB 2026-03-08T22:45:49.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:49.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 CCCC 2026-03-08T22:45:49.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:49.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 DDDD 2026-03-08T22:45:49.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:78: rados_put_get: rados --pool pool-lrc-a put SOMETHING td/test-erasure-code/ORIGINAL 2026-03-08T22:45:49.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:79: rados_put_get: rados --pool pool-lrc-a get SOMETHING td/test-erasure-code/COPY 2026-03-08T22:45:50.108 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:80: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:45:50.109 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:81: rados_put_get: rm td/test-erasure-code/COPY 2026-03-08T22:45:50.110 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: get_osds pool-lrc-a SOMETHING 2026-03-08T22:45:50.110 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-lrc-a 2026-03-08T22:45:50.110 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:45:50.111 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-lrc-a SOMETHING 2026-03-08T22:45:50.111 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:45:50.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=5 2026-03-08T22:45:50.332 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:45:50.332 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:45:50.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 5 9 3 2026-03-08T22:45:50.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: initial_osds=('5' '9' '3') 2026-03-08T22:45:50.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: local -a initial_osds 2026-03-08T22:45:50.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:89: rados_put_get: local last=2 2026-03-08T22:45:50.332 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:90: rados_put_get: ceph osd out 3 2026-03-08T22:45:50.615 INFO:tasks.workunit.client.0.vm04.stderr:osd.3 is already out. 2026-03-08T22:45:50.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:93: rados_put_get: sleep 5 2026-03-08T22:45:55.629 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: get_osds pool-lrc-a SOMETHING 2026-03-08T22:45:55.629 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-lrc-a 2026-03-08T22:45:55.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:45:55.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: grep '\<3\>' 2026-03-08T22:45:55.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-lrc-a SOMETHING 2026-03-08T22:45:55.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:45:55.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=5 2026-03-08T22:45:55.864 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:45:55.864 INFO:tasks.workunit.client.0.vm04.stderr:7' 2026-03-08T22:45:55.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 5 9 7 2026-03-08T22:45:55.864 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:96: rados_put_get: rados --pool pool-lrc-a get SOMETHING td/test-erasure-code/COPY 2026-03-08T22:45:55.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:97: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:45:55.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:98: rados_put_get: ceph osd in 3 2026-03-08T22:45:56.212 INFO:tasks.workunit.client.0.vm04.stderr:osd.3 is already in. 2026-03-08T22:45:56.222 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:100: rados_put_get: rm td/test-erasure-code/ORIGINAL 2026-03-08T22:45:56.223 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:170: TEST_rados_put_get_lrc_advanced: delete_pool pool-lrc-a 2026-03-08T22:45:56.223 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=pool-lrc-a 2026-03-08T22:45:56.223 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete pool-lrc-a pool-lrc-a --yes-i-really-really-mean-it 2026-03-08T22:45:56.537 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-lrc-a' does not exist 2026-03-08T22:45:56.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:171: TEST_rados_put_get_lrc_advanced: ceph osd erasure-code-profile rm profile-lrc-a 2026-03-08T22:45:56.861 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile profile-lrc-a does not exist 2026-03-08T22:45:56.876 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:47: run: for func in $funcs 2026-03-08T22:45:56.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:48: run: TEST_rados_put_get_lrc_kml td/test-erasure-code 2026-03-08T22:45:56.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:175: TEST_rados_put_get_lrc_kml: local dir=td/test-erasure-code 2026-03-08T22:45:56.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:176: TEST_rados_put_get_lrc_kml: local poolname=pool-lrc 2026-03-08T22:45:56.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:177: TEST_rados_put_get_lrc_kml: local profile=profile-lrc 2026-03-08T22:45:56.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:179: TEST_rados_put_get_lrc_kml: ceph osd erasure-code-profile set profile-lrc plugin=lrc k=4 m=2 l=3 crush-failure-domain=osd 2026-03-08T22:45:57.181 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:183: TEST_rados_put_get_lrc_kml: create_pool pool-lrc 12 12 erasure profile-lrc 2026-03-08T22:45:57.181 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-lrc 12 12 erasure profile-lrc 2026-03-08T22:45:57.889 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-lrc' already exists 2026-03-08T22:45:57.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:45:58.903 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:186: TEST_rados_put_get_lrc_kml: rados_put_get td/test-erasure-code pool-lrc 2026-03-08T22:45:58.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:66: rados_put_get: local dir=td/test-erasure-code 2026-03-08T22:45:58.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:67: rados_put_get: local poolname=pool-lrc 2026-03-08T22:45:58.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:68: rados_put_get: local objname=SOMETHING 2026-03-08T22:45:58.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:58.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 AAA 2026-03-08T22:45:58.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:58.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 BBB 2026-03-08T22:45:58.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:58.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 CCCC 2026-03-08T22:45:58.904 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:45:58.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 DDDD 2026-03-08T22:45:58.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:78: rados_put_get: rados --pool pool-lrc put SOMETHING td/test-erasure-code/ORIGINAL 2026-03-08T22:45:58.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:79: rados_put_get: rados --pool pool-lrc get SOMETHING td/test-erasure-code/COPY 2026-03-08T22:45:58.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:80: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:45:58.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:81: rados_put_get: rm td/test-erasure-code/COPY 2026-03-08T22:45:58.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: get_osds pool-lrc SOMETHING 2026-03-08T22:45:58.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-lrc 2026-03-08T22:45:58.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:45:58.977 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-lrc 
SOMETHING 2026-03-08T22:45:58.978 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=8 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr:10 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr:7' 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 8 6 0 10 5 3 9 7 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: initial_osds=('8' '6' '0' '10' '5' '3' '9' '7') 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: local -a initial_osds 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:89: rados_put_get: local last=7 2026-03-08T22:45:59.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:90: rados_put_get: ceph osd out 7 2026-03-08T22:45:59.492 INFO:tasks.workunit.client.0.vm04.stderr:osd.7 is already out. 
2026-03-08T22:45:59.502 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:93: rados_put_get: sleep 5 2026-03-08T22:46:04.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: get_osds pool-lrc SOMETHING 2026-03-08T22:46:04.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-lrc 2026-03-08T22:46:04.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:46:04.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: grep '\<7\>' 2026-03-08T22:46:04.506 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-lrc SOMETHING 2026-03-08T22:46:04.506 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:46:04.759 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=8 2026-03-08T22:46:04.759 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:46:04.759 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:46:04.759 INFO:tasks.workunit.client.0.vm04.stderr:10 2026-03-08T22:46:04.759 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:46:04.759 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:46:04.759 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:46:04.759 INFO:tasks.workunit.client.0.vm04.stderr:4' 2026-03-08T22:46:04.759 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 8 6 0 10 5 3 9 4 2026-03-08T22:46:04.759 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:96: rados_put_get: rados --pool pool-lrc get SOMETHING td/test-erasure-code/COPY 2026-03-08T22:46:04.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:97: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:46:04.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:98: rados_put_get: ceph osd in 7 2026-03-08T22:46:05.368 INFO:tasks.workunit.client.0.vm04.stderr:osd.7 is already in. 2026-03-08T22:46:05.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:100: rados_put_get: rm td/test-erasure-code/ORIGINAL 2026-03-08T22:46:05.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:188: TEST_rados_put_get_lrc_kml: delete_pool pool-lrc 2026-03-08T22:46:05.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=pool-lrc 2026-03-08T22:46:05.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete pool-lrc pool-lrc --yes-i-really-really-mean-it 2026-03-08T22:46:05.795 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-lrc' does not exist 2026-03-08T22:46:05.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:189: TEST_rados_put_get_lrc_kml: ceph osd erasure-code-profile rm profile-lrc 
2026-03-08T22:46:06.135 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile profile-lrc does not exist 2026-03-08T22:46:06.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:47: run: for func in $funcs 2026-03-08T22:46:06.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:48: run: TEST_rados_put_get_shec td/test-erasure-code 2026-03-08T22:46:06.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:234: TEST_rados_put_get_shec: local dir=td/test-erasure-code 2026-03-08T22:46:06.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:236: TEST_rados_put_get_shec: local poolname=pool-shec 2026-03-08T22:46:06.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:237: TEST_rados_put_get_shec: local profile=profile-shec 2026-03-08T22:46:06.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:239: TEST_rados_put_get_shec: ceph osd erasure-code-profile set profile-shec plugin=shec k=2 m=1 c=1 crush-failure-domain=osd 2026-03-08T22:46:06.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:243: TEST_rados_put_get_shec: create_pool pool-shec 12 12 erasure profile-shec 2026-03-08T22:46:06.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-shec 12 12 erasure profile-shec 2026-03-08T22:46:06.847 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-shec' already exists 2026-03-08T22:46:06.857 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:46:07.858 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:246: TEST_rados_put_get_shec: rados_put_get td/test-erasure-code pool-shec 2026-03-08T22:46:07.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:66: rados_put_get: local dir=td/test-erasure-code 2026-03-08T22:46:07.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:67: rados_put_get: local poolname=pool-shec 2026-03-08T22:46:07.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:68: rados_put_get: local objname=SOMETHING 2026-03-08T22:46:07.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:46:07.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 AAA 2026-03-08T22:46:07.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:46:07.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 BBB 2026-03-08T22:46:07.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:46:07.859 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 CCCC 2026-03-08T22:46:07.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:71: rados_put_get: for marker in AAA BBB CCCC DDDD 2026-03-08T22:46:07.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:72: rados_put_get: printf '%*s' 1024 DDDD 2026-03-08T22:46:07.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:78: rados_put_get: rados --pool pool-shec put SOMETHING td/test-erasure-code/ORIGINAL 2026-03-08T22:46:07.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:79: rados_put_get: rados --pool pool-shec get SOMETHING td/test-erasure-code/COPY 2026-03-08T22:46:07.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:80: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:46:07.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:81: rados_put_get: rm td/test-erasure-code/COPY 2026-03-08T22:46:07.934 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: get_osds pool-shec SOMETHING 2026-03-08T22:46:07.934 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-shec 2026-03-08T22:46:07.934 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local 
objectname=SOMETHING 2026-03-08T22:46:07.935 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-shec SOMETHING 2026-03-08T22:46:07.935 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:46:08.171 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=7 2026-03-08T22:46:08.171 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:46:08.171 INFO:tasks.workunit.client.0.vm04.stderr:1' 2026-03-08T22:46:08.171 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 7 9 1 2026-03-08T22:46:08.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: initial_osds=('7' '9' '1') 2026-03-08T22:46:08.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:88: rados_put_get: local -a initial_osds 2026-03-08T22:46:08.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:89: rados_put_get: local last=2 2026-03-08T22:46:08.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:90: rados_put_get: ceph osd out 1 2026-03-08T22:46:08.515 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 is already out. 
2026-03-08T22:46:08.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:93: rados_put_get: sleep 5 2026-03-08T22:46:13.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: get_osds pool-shec SOMETHING 2026-03-08T22:46:13.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:94: rados_put_get: grep '\<1\>' 2026-03-08T22:46:13.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-shec 2026-03-08T22:46:13.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:46:13.529 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-shec SOMETHING 2026-03-08T22:46:13.529 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:46:13.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=7 2026-03-08T22:46:13.767 INFO:tasks.workunit.client.0.vm04.stderr:9 2026-03-08T22:46:13.767 INFO:tasks.workunit.client.0.vm04.stderr:6' 2026-03-08T22:46:13.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 7 9 6 2026-03-08T22:46:13.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:96: rados_put_get: rados --pool pool-shec get SOMETHING td/test-erasure-code/COPY 2026-03-08T22:46:13.793 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:97: rados_put_get: diff td/test-erasure-code/ORIGINAL td/test-erasure-code/COPY 2026-03-08T22:46:13.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:98: rados_put_get: ceph osd in 1 2026-03-08T22:46:14.143 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 is already in. 2026-03-08T22:46:14.152 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:100: rados_put_get: rm td/test-erasure-code/ORIGINAL 2026-03-08T22:46:14.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:248: TEST_rados_put_get_shec: delete_pool pool-shec 2026-03-08T22:46:14.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=pool-shec 2026-03-08T22:46:14.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete pool-shec pool-shec --yes-i-really-really-mean-it 2026-03-08T22:46:14.797 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-shec' does not exist 2026-03-08T22:46:14.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:249: TEST_rados_put_get_shec: ceph osd erasure-code-profile rm profile-shec 2026-03-08T22:46:15.102 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile profile-shec does not exist 2026-03-08T22:46:15.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:51: run: delete_pool ecpool 2026-03-08T22:46:15.120 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: delete_pool: local poolname=ecpool 2026-03-08T22:46:15.120 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: delete_pool: ceph osd pool delete ecpool ecpool --yes-i-really-really-mean-it 2026-03-08T22:46:15.393 INFO:tasks.workunit.client.0.vm04.stderr:pool 'ecpool' does not exist 2026-03-08T22:46:15.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-code.sh:52: run: teardown td/test-erasure-code 2026-03-08T22:46:15.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-code 2026-03-08T22:46:15.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:46:15.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code KILL 2026-03-08T22:46:15.405 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:46:15.405 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:46:15.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:46:15.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:46:15.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 
2026-03-08T22:46:15.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:46:15.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:46:15.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:46:15.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:46:15.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:46:15.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:46:15.556 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:46:15.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:46:15.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:46:15.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:46:15.557 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:46:15.558 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:46:15.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:46:15.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code 2026-03-08T22:46:15.633 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:46:15.633 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:15.633 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:46:15.633 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.74377 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/test-erasure-code 0 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local 
dir=td/test-erasure-code 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-code KILL 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:46:15.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:46:15.638 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:46:15.638 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:46:15.638 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:46:15.639 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:46:15.640 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:46:15.640 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:46:15.640 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:46:15.640 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:46:15.640 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:46:15.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:46:15.642 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:46:15.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:46:15.643 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-08T22:46:15.643 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-code 2026-03-08T22:46:15.644 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:46:15.644 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:15.644 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.74377 2026-03-08T22:46:15.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.74377 2026-03-08T22:46:15.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:46:15.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:46:15.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T22:46:15.646 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T22:46:15.646 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T22:46:15.711 INFO:tasks.workunit:Running workunit erasure-code/test-erasure-eio.sh... 
2026-03-08T22:46:15.711 DEBUG:teuthology.orchestra.run.vm04:workunit test erasure-code/test-erasure-eio.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh 2026-03-08T22:46:15.775 INFO:tasks.workunit.client.0.vm04.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T22:46:15.780 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/test-erasure-eio 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:22: run: local dir=td/test-erasure-eio 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:23: run: shift 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:25: run: export CEPH_MON=127.0.0.1:7112 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:25: run: CEPH_MON=127.0.0.1:7112 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:26: run: export CEPH_ARGS 2026-03-08T22:46:15.780 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:27: run: uuidgen 2026-03-08T22:46:15.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:27: run: CEPH_ARGS+='--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none ' 2026-03-08T22:46:15.781 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:28: run: CEPH_ARGS+='--mon-host=127.0.0.1:7112 ' 2026-03-08T22:46:15.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:29: run: CEPH_ARGS+='--osd_mclock_override_recovery_settings=true ' 2026-03-08T22:46:15.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:31: run: set 2026-03-08T22:46:15.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:31: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p' 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:31: run: local 'funcs=TEST_ec_backfill_unfound 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_ec_object_attr_read_error 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_ec_recovery_multiple_errors 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_ec_recovery_multiple_objects 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_ec_recovery_multiple_objects_eio 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_ec_recovery_unfound 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_ec_single_recovery_error 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_get_bad_size_shard_0 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_get_bad_size_shard_1 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_get_subread_eio_shard_0 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_get_subread_eio_shard_1 2026-03-08T22:46:15.784 
INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_get_subread_missing 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_get_with_subreadall_eio_shard_0 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:TEST_rados_get_with_subreadall_eio_shard_1' 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: 
kill_daemons: local trace=true 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:46:15.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:46:15.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:46:15.786 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:46:15.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:46:15.787 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:46:15.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:46:15.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:46:15.789 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:46:15.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:46:15.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:46:15.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:46:15.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:46:15.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:46:15.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:46:15.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:46:15.793 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:46:15.793 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:15.793 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:46:15.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:46:15.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:46:15.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:46:15.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T22:46:15.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:46:15.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:15.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:46:15.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080 2026-03-08T22:46:15.796 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:46:15.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']' 
2026-03-08T22:46:15.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096 2026-03-08T22:46:15.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:46:15.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T22:46:15.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T22:46:15.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T22:46:15.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:46:15.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:46:15.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:46:15.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T22:46:15.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T22:46:15.826 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:46:15.827 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:46:15.827 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:46:15.827 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:46:15.827 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:15.827 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:46:15.827 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:46:15.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:46:15.861 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 
2026-03-08T22:46:15.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:46:15.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:46:15.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:46:15.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:46:15.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:46:15.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:46:15.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:46:15.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:46:15.864 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:15.864 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:15.864 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:46:15.864 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:46:15.864 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:46:15.864 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T22:46:15.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:46:15.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:46:15.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:46:15.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:46:15.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:46:15.919 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:46:15.919 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:46:15.919 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:46:15.919 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:15.919 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:15.919 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:46:15.920 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:46:15.920 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:46:15.920 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T22:46:15.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T22:46:15.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T22:46:15.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:46:15.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:46:15.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:46:15.973 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T22:46:15.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:46:16.101 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:46:16.101 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:46:16.101 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:46:16.101 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:46:16.101 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:16.101 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:46:16.101 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:46:16.101 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:46:16.102 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 
20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:46:16.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T22:46:16.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:46:16.326 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:46:16.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:46:17.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T22:46:17.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:46:17.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:46:17.342 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:17.342 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:17.342 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:46:17.342 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:46:17.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T22:46:17.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T22:46:17.387 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:46:17.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T22:46:17.394 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:46:15.851+0000 7ffa6e1ccd80 0 load: jerasure load: lrc 2026-03-08T22:46:17.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_ec_backfill_unfound td/test-erasure-eio 2026-03-08T22:46:17.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:511: TEST_ec_backfill_unfound: local dir=td/test-erasure-eio 2026-03-08T22:46:17.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:512: TEST_ec_backfill_unfound: local objname=myobject 2026-03-08T22:46:17.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:513: TEST_ec_backfill_unfound: local lastobj=300 2026-03-08T22:46:17.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:515: TEST_ec_backfill_unfound: local testobj=obj250 
2026-03-08T22:46:17.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:517: TEST_ec_backfill_unfound: ORIG_ARGS='--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:46:17.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:518: TEST_ec_backfill_unfound: CEPH_ARGS+=' --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10' 2026-03-08T22:46:17.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:519: TEST_ec_backfill_unfound: setup_osds 5 2026-03-08T22:46:17.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=5 2026-03-08T22:46:17.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T22:46:17.395 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 5 - 1 2026-03-08T22:46:17.396 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 4 2026-03-08T22:46:17.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:46:17.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T22:46:17.397 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:46:17.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:46:17.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:46:17.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:46:17.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T22:46:17.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10' 2026-03-08T22:46:17.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:46:17.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:46:17.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T22:46:17.398 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:46:17.398 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:46:17.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:46:17.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:46:17.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:46:17.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:46:17.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:46:17.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:46:17.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:46:17.399 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:46:17.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:46:17.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T22:46:17.400 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:46:17.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=ed9e2bbd-6a7a-4605-afda-0bb0dc854504 2026-03-08T22:46:17.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 ed9e2bbd-6a7a-4605-afda-0bb0dc854504' 2026-03-08T22:46:17.401 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 ed9e2bbd-6a7a-4605-afda-0bb0dc854504 2026-03-08T22:46:17.401 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:46:17.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQA5/K1pLH/yGBAAZZdvzyX0DHjK5gZMmzsJvQ== 2026-03-08T22:46:17.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQA5/K1pLH/yGBAAZZdvzyX0DHjK5gZMmzsJvQ=="}' 2026-03-08T22:46:17.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new ed9e2bbd-6a7a-4605-afda-0bb0dc854504 -i td/test-erasure-eio/0/new.json 2026-03-08T22:46:17.556 
INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:46:17.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T22:46:17.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQA5/K1pLH/yGBAAZZdvzyX0DHjK5gZMmzsJvQ== --osd-uuid ed9e2bbd-6a7a-4605-afda-0bb0dc854504 2026-03-08T22:46:17.598 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:17.597+0000 7fbb4ccd9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:17.598 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:17.600+0000 7fbb4ccd9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:17.600 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:17.601+0000 7fbb4ccd9780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:46:17.601 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:17.602+0000 7fbb4ccd9780 -1 bdev(0x5623c3a85c00 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:46:17.601 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:17.602+0000 7fbb4ccd9780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T22:46:20.226 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T22:46:20.226 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:46:20.227 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:46:20.227 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:46:20.227 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:46:20.519 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:46:20.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:46:20.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 
--osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:46:20.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:46:20.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:46:20.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:46:20.540 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:20.541+0000 7fdf2052c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:20.548 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:20.549+0000 7fdf2052c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:20.550 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:20.550+0000 7fdf2052c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:46:20.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:46:20.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:46:20.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:46:20.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:46:20.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:46:20.753 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:46:20.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:20.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:46:20.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:20.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:46:20.974 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:21.104 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:21.105+0000 7fdf2052c780 -1 Falling back to public interface 2026-03-08T22:46:21.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:46:21.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:21.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:46:21.976 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:46:21.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:46:21.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:21.984 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:21.985+0000 7fdf2052c780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:46:22.195 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:23.197 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:46:23.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:23.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:23.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:46:23.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:23.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:46:23.458 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:24.461 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:46:24.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:24.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:24.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:46:24.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:24.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:46:24.686 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/113024798,v1:127.0.0.1:6803/113024798] [v2:127.0.0.1:6804/113024798,v1:127.0.0.1:6805/113024798] exists,up ed9e2bbd-6a7a-4605-afda-0bb0dc854504 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 
1)) 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10' 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:46:24.687 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:46:24.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:46:24.688 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:46:24.688 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:46:24.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:46:24.689 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:46:24.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=b2c88ca8-3a56-4e37-a53e-287229e64e4c 2026-03-08T22:46:24.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 b2c88ca8-3a56-4e37-a53e-287229e64e4c' 2026-03-08T22:46:24.690 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 b2c88ca8-3a56-4e37-a53e-287229e64e4c 2026-03-08T22:46:24.691 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:46:24.710 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBA/K1peAZaKhAA1dqHY6qXHf1983OuFGVbjA== 2026-03-08T22:46:24.710 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBA/K1peAZaKhAA1dqHY6qXHf1983OuFGVbjA=="}' 2026-03-08T22:46:24.710 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new b2c88ca8-3a56-4e37-a53e-287229e64e4c -i td/test-erasure-eio/1/new.json 2026-03-08T22:46:24.990 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:46:25.003 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T22:46:25.003 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBA/K1peAZaKhAA1dqHY6qXHf1983OuFGVbjA== --osd-uuid b2c88ca8-3a56-4e37-a53e-287229e64e4c 2026-03-08T22:46:25.025 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:25.026+0000 7efcc8d3f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:25.027 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:25.028+0000 7efcc8d3f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:25.028 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:25.029+0000 7efcc8d3f780 -1 WARNING: all dangerous and experimental features are enabled. 
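The trace above shows `run_osd` provisioning a fresh OSD: generate a uuid, register it with the monitor via `ceph osd new`, then format the data directory with `ceph-osd --mkfs` before the daemon is started. A condensed, illustrative sketch of that sequence follows — function name, `$ceph_args`, and paths are stand-ins for the values built up in the log, not the actual ceph-helpers.sh implementation:

```shell
# Condensed sketch of the run_osd provisioning sequence traced above.
# $ceph_args stands in for the long option string assembled in the log;
# illustrative only, not the real ceph-helpers.sh code.
provision_osd() {
    local dir=$1 id=$2
    local osd_data=$dir/$id
    mkdir -p "$osd_data"

    # Register a fresh OSD uuid with the monitor, passing its cephx key.
    local uuid secret
    uuid=$(uuidgen)
    secret=$(ceph-authtool --gen-print-key)
    echo "{\"cephx_secret\": \"$secret\"}" > "$osd_data/new.json"
    ceph osd new "$uuid" -i "$osd_data/new.json" || return 1
    rm "$osd_data/new.json"

    # Format the OSD's data directory before first start.
    ceph-osd -i "$id" $ceph_args --mkfs \
        --key "$secret" --osd-uuid "$uuid"
}
```

The `bdev ... open stat got: (1) Operation not permitted` and `_read_fsid unparsable uuid` lines in the trace are emitted during this `--mkfs` step, before BlueStore has created its block file.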
2026-03-08T22:46:25.029 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:25.030+0000 7efcc8d3f780 -1 bdev(0x558d679ebc00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:46:25.029 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:25.030+0000 7efcc8d3f780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T22:46:27.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T22:46:27.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:46:27.423 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:46:27.423 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T22:46:27.423 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:46:27.733 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:46:27.733 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T22:46:27.733 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 
--osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:46:27.733 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:46:27.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:46:27.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:46:27.756 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:27.756+0000 7f996ba11780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:27.757 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:27.758+0000 7f996ba11780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:27.759 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:27.759+0000 7f996ba11780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:46:27.964 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:46:27.964 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:46:27.964 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:46:27.964 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:46:27.964 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:46:27.964 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:27.964 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:46:27.964 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:46:27.964 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:27.965 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:46:28.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:28.314 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:28.314+0000 7f996ba11780 -1 Falling back to public interface 2026-03-08T22:46:29.168 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:29.169+0000 7f996ba11780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:46:29.216 
INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:46:29.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:29.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:29.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:46:29.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:29.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:46:29.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:30.466 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:46:30.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:30.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:30.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:46:30.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:30.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:46:30.714 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 
up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3922432146,v1:127.0.0.1:6811/3922432146] [v2:127.0.0.1:6812/3922432146,v1:127.0.0.1:6813/3922432146] exists,up b2c88ca8-3a56-4e37-a53e-287229e64e4c 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:46:30.715 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10' 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:46:30.715 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:46:30.716 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:46:30.716 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:46:30.716 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:46:30.716 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:30.716 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:46:30.716 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:46:30.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:46:30.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:46:30.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:46:30.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:46:30.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:46:30.717 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:46:30.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:46:30.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:46:30.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:46:30.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:46:30.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:46:30.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:46:30.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:46:30.718 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:46:30.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=9d09b8f7-7eb9-43b3-b8be-eb3ac2ab9132 2026-03-08T22:46:30.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 
9d09b8f7-7eb9-43b3-b8be-eb3ac2ab9132' 2026-03-08T22:46:30.719 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 9d09b8f7-7eb9-43b3-b8be-eb3ac2ab9132 2026-03-08T22:46:30.719 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:46:30.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBG/K1p9xXcKxAAmhtWKku5Ashby4Eq94PR/g== 2026-03-08T22:46:30.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBG/K1p9xXcKxAAmhtWKku5Ashby4Eq94PR/g=="}' 2026-03-08T22:46:30.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 9d09b8f7-7eb9-43b3-b8be-eb3ac2ab9132 -i td/test-erasure-eio/2/new.json 2026-03-08T22:46:30.965 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:46:30.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T22:46:30.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 
--osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBG/K1p9xXcKxAAmhtWKku5Ashby4Eq94PR/g== --osd-uuid 9d09b8f7-7eb9-43b3-b8be-eb3ac2ab9132 2026-03-08T22:46:30.998 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:30.999+0000 7f17799ab780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:30.999 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:31.001+0000 7f17799ab780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:31.000 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:31.002+0000 7f17799ab780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:31.001 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:31.002+0000 7f17799ab780 -1 bdev(0x55d05bcd3c00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:46:31.001 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:31.002+0000 7f17799ab780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T22:46:33.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T22:46:33.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:46:33.635 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:46:33.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:46:33.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 
'allow profile osd' 2026-03-08T22:46:33.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:46:33.930 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:46:33.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:46:33.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:46:33.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:46:33.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:46:33.950 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:33.951+0000 7fbaf5cd2780 -1 WARNING: all dangerous and experimental features are enabled. 
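After each `ceph-osd` start, the helper's `wait_for_osd` (ceph-helpers.sh lines 978–991 in this trace) polls `ceph osd dump` once per second, up to 300 attempts, until `osd.<id> up` appears. A minimal standalone sketch of the same polling pattern — the helper name here is hypothetical, not the original function:

```shell
# Minimal sketch of the wait_for_osd polling pattern seen in the trace:
# poll `ceph osd dump` once per second until the OSD reports the desired
# state, giving up after 300 attempts. Illustrative only.
wait_for_osd_state() {
    local state=$1 id=$2
    local i
    for ((i = 0; i < 300; i++)); do
        if ceph osd dump | grep -q "osd.$id $state"; then
            return 0
        fi
        sleep 1
    done
    return 1   # timed out waiting for the OSD to reach $state
}
```

In the trace, osd.1 and osd.2 each reach `up` within two or three iterations of this loop.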
2026-03-08T22:46:33.956 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:33.958+0000 7fbaf5cd2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:33.957 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:33.959+0000 7fbaf5cd2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:34.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:46:34.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:46:34.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:46:34.167 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:46:34.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:46:34.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:46:34.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:34.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:46:34.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:34.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:46:34.389 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:35.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:35.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:35.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:46:35.391 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:46:35.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:35.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:46:35.541 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:35.543+0000 7fbaf5cd2780 -1 Falling back to public interface
2026-03-08T22:46:35.620 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:36.416 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:36.417+0000 7fbaf5cd2780 -1 osd.2 0 log_to_monitors true
2026-03-08T22:46:36.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:36.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:36.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:46:36.622 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:46:36.623 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:36.623 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:46:36.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:37.797 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:37.798+0000 7fbaf1473640 -1 osd.2 0 waiting for initial osdmap
2026-03-08T22:46:37.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:37.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:37.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:46:37.852 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:46:37.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:37.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:46:38.095 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 19 up_thru 19 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/2121329779,v1:127.0.0.1:6819/2121329779] [v2:127.0.0.1:6820/2121329779,v1:127.0.0.1:6821/2121329779] exists,up 9d09b8f7-7eb9-43b3-b8be-eb3ac2ab9132
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10'
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3'
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal'
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:46:38.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:46:38.097 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:46:38.097 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:46:38.097 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:46:38.097 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:46:38.097 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:38.097 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:46:38.097 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:46:38.097 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:46:38.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3
2026-03-08T22:46:38.099 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:46:38.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0b2c077d-5992-4d66-a6b7-fffd1db14b8b
2026-03-08T22:46:38.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 0b2c077d-5992-4d66-a6b7-fffd1db14b8b'
2026-03-08T22:46:38.100 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 0b2c077d-5992-4d66-a6b7-fffd1db14b8b
2026-03-08T22:46:38.100 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:46:38.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBO/K1pF2P3BhAARgTs/13VP0Ts4pDZQSu5cw==
2026-03-08T22:46:38.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBO/K1pF2P3BhAARgTs/13VP0Ts4pDZQSu5cw=="}'
2026-03-08T22:46:38.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0b2c077d-5992-4d66-a6b7-fffd1db14b8b -i td/test-erasure-eio/3/new.json
2026-03-08T22:46:38.339 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:46:38.350 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json
2026-03-08T22:46:38.350 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBO/K1pF2P3BhAARgTs/13VP0Ts4pDZQSu5cw== --osd-uuid 0b2c077d-5992-4d66-a6b7-fffd1db14b8b
2026-03-08T22:46:38.372 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:38.373+0000 7f88dbe10780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:38.374 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:38.375+0000 7f88dbe10780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:38.375 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:38.377+0000 7f88dbe10780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:38.376 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:38.377+0000 7f88dbe10780 -1 bdev(0x562d6cd1bc00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted
2026-03-08T22:46:38.376 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:38.377+0000 7f88dbe10780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid
2026-03-08T22:46:40.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring
2026-03-08T22:46:40.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:46:40.815 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository
2026-03-08T22:46:40.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository
2026-03-08T22:46:40.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:46:41.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3
2026-03-08T22:46:41.101 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3
2026-03-08T22:46:41.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:46:41.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:46:41.102 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:46:41.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:46:41.122 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:41.122+0000 7f5d025a9780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:41.129 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:41.129+0000 7f5d025a9780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:41.129 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:41.131+0000 7f5d025a9780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:41.333 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3
2026-03-08T22:46:41.334 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:46:41.334 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3
2026-03-08T22:46:41.334 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:46:41.334 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:46:41.334 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:41.334 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:46:41.334 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:46:41.334 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:41.334 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:46:41.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:41.942 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:41.942+0000 7f5d025a9780 -1 Falling back to public interface
2026-03-08T22:46:42.575 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:46:42.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:42.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:42.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:46:42.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:42.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:46:42.793 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:42.794+0000 7f5d025a9780 -1 osd.3 0 log_to_monitors true
2026-03-08T22:46:42.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:43.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:43.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:43.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:46:43.847 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:46:43.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:43.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:46:44.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:45.216 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:46:45.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:45.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:45.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:46:45.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:45.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:46:45.445 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 28 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/1000029668,v1:127.0.0.1:6827/1000029668] [v2:127.0.0.1:6828/1000029668,v1:127.0.0.1:6829/1000029668] exists,up 0b2c077d-5992-4d66-a6b7-fffd1db14b8b
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 4
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/4
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10'
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/4'
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/4/journal'
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:46:45.446 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:46:45.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/4
2026-03-08T22:46:45.449 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:46:45.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=3e6e814a-41e8-4d7f-af17-c6fc638bd90c
2026-03-08T22:46:45.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 3e6e814a-41e8-4d7f-af17-c6fc638bd90c'
2026-03-08T22:46:45.450 INFO:tasks.workunit.client.0.vm04.stdout:add osd4 3e6e814a-41e8-4d7f-af17-c6fc638bd90c
2026-03-08T22:46:45.450 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:46:45.464 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBV/K1pGJa/GxAAHgoRmJIMWCdipTkCzBe6rw==
2026-03-08T22:46:45.464 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBV/K1pGJa/GxAAHgoRmJIMWCdipTkCzBe6rw=="}'
2026-03-08T22:46:45.464 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 3e6e814a-41e8-4d7f-af17-c6fc638bd90c -i td/test-erasure-eio/4/new.json
2026-03-08T22:46:45.692 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-08T22:46:45.702 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/4/new.json
2026-03-08T22:46:45.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBV/K1pGJa/GxAAHgoRmJIMWCdipTkCzBe6rw== --osd-uuid 3e6e814a-41e8-4d7f-af17-c6fc638bd90c
2026-03-08T22:46:45.723 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:45.725+0000 7f359c4a4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:45.725 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:45.727+0000 7f359c4a4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:45.726 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:45.728+0000 7f359c4a4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:45.727 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:45.728+0000 7f359c4a4780 -1 bdev(0x5599e7389c00 td/test-erasure-eio/4/block) open stat got: (1) Operation not permitted
2026-03-08T22:46:45.727 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:45.728+0000 7f359c4a4780 -1 bluestore(td/test-erasure-eio/4) _read_fsid unparsable uuid
2026-03-08T22:46:48.151 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/4/keyring
2026-03-08T22:46:48.152 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:46:48.152 INFO:tasks.workunit.client.0.vm04.stdout:adding osd4 key to auth repository
2026-03-08T22:46:48.152 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository
2026-03-08T22:46:48.152 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:46:48.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4
2026-03-08T22:46:48.468 INFO:tasks.workunit.client.0.vm04.stdout:start osd.4
2026-03-08T22:46:48.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:46:48.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:46:48.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:46:48.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:46:48.489 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:48.489+0000 7f88d8a13780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:48.491 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:48.492+0000 7f88d8a13780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:48.493 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:48.493+0000 7f88d8a13780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:48.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4
2026-03-08T22:46:48.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:46:48.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4
2026-03-08T22:46:48.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:46:48.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:46:48.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:48.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:46:48.708 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:46:48.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:48.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up'
2026-03-08T22:46:48.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:49.557 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:49.558+0000 7f88d8a13780 -1 Falling back to public interface
2026-03-08T22:46:49.946 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:46:49.946 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:49.946 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:49.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:46:49.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:49.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up'
2026-03-08T22:46:50.169 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:50.426 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:50.427+0000 7f88d8a13780 -1 osd.4 0 log_to_monitors true
2026-03-08T22:46:51.172 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:46:51.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:51.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:51.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:46:51.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:51.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up'
2026-03-08T22:46:51.445
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:51.795 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:46:51.796+0000 7f88d41b2640 -1 osd.4 0 waiting for initial osdmap 2026-03-08T22:46:52.447 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:46:52.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:52.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:52.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:46:52.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:52.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:46:52.705 INFO:tasks.workunit.client.0.vm04.stdout:osd.4 up in weight 1 up_from 35 up_thru 36 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/1157406435,v1:127.0.0.1:6835/1157406435] [v2:127.0.0.1:6836/1157406435,v1:127.0.0.1:6837/1157406435] exists,up 3e6e814a-41e8-4d7f-af17-c6fc638bd90c 2026-03-08T22:46:52.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:46:52.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:46:52.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:46:52.706 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 2026-03-08T22:46:52.706 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:46:52.706 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:46:52.707 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:46:52.707 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:52.707 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:46:52.707 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:46:52.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T22:46:52.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T22:46:52.758 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:46:52.764 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T22:46:52.766 
INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:46:21.109+0000 7fdf2052c780 0 load: jerasure load: lrc 2026-03-08T22:46:52.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:520: TEST_ec_backfill_unfound: CEPH_ARGS='--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:46:52.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:522: TEST_ec_backfill_unfound: local poolname=pool-jerasure 2026-03-08T22:46:52.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:523: TEST_ec_backfill_unfound: create_erasure_coded_pool pool-jerasure 3 2 2026-03-08T22:46:52.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:46:52.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T22:46:52.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=3 2026-03-08T22:46:52.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T22:46:52.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=2 2026-03-08T22:46:52.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: 
create_erasure_coded_pool: shift 2026-03-08T22:46:52.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=3 m=2 crush-failure-domain=osd 2026-03-08T22:46:53.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T22:46:53.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T22:46:53.445 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T22:46:53.457 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:46:54.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T22:46:54.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:46:54.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:46:54.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:46:54.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:46:54.459 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:46:54.459 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:46:54.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:46:54.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:46:54.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:46:54.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:46:54.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:46:54.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:46:54.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:46:54.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:46:54.546 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:46:54.767 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:46:54.767 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:46:54.767 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:46:54.767 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:46:54.767 INFO:tasks.workunit.client.0.vm04.stderr:4' 2026-03-08T22:46:54.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:46:54.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:54.767 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:46:54.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803785 2026-03-08T22:46:54.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803785 2026-03-08T22:46:54.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803785' 2026-03-08T22:46:54.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:54.852 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:46:54.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574855 2026-03-08T22:46:54.939 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574855 2026-03-08T22:46:54.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803785 1-55834574855' 2026-03-08T22:46:54.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:54.940 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:46:55.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378630 2026-03-08T22:46:55.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378630 2026-03-08T22:46:55.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803785 1-55834574855 2-81604378630' 2026-03-08T22:46:55.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:55.029 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:46:55.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116998 2026-03-08T22:46:55.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116998 2026-03-08T22:46:55.104 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803785 1-55834574855 2-81604378630 3-115964116998' 2026-03-08T22:46:55.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:55.104 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:46:55.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855363 2026-03-08T22:46:55.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855363 2026-03-08T22:46:55.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803785 1-55834574855 2-81604378630 3-115964116998 4-150323855363' 2026-03-08T22:46:55.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:55.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803785 2026-03-08T22:46:55.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:55.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:46:55.183 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803785 
2026-03-08T22:46:55.183 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:55.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803785 2026-03-08T22:46:55.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803785' 2026-03-08T22:46:55.184 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803785 2026-03-08T22:46:55.184 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:55.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803783 -lt 25769803785 2026-03-08T22:46:55.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:46:56.410 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:46:56.411 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:56.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803785 -lt 25769803785 2026-03-08T22:46:56.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:56.635 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574855 2026-03-08T22:46:56.635 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:56.636 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:46:56.637 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574855 2026-03-08T22:46:56.637 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:56.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574855 2026-03-08T22:46:56.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574855' 2026-03-08T22:46:56.638 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574855 2026-03-08T22:46:56.638 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:46:56.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574856 -lt 55834574855 2026-03-08T22:46:56.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:56.850 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378630 
2026-03-08T22:46:56.850 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:56.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:46:56.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378630 2026-03-08T22:46:56.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:56.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378630 2026-03-08T22:46:56.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378630' 2026-03-08T22:46:56.852 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378630 2026-03-08T22:46:56.852 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:46:57.077 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378630 -lt 81604378630 2026-03-08T22:46:57.077 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:57.078 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116998 2026-03-08T22:46:57.078 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut 
-d - -f 1 2026-03-08T22:46:57.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:46:57.080 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116998 2026-03-08T22:46:57.080 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:57.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116998 2026-03-08T22:46:57.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116998' 2026-03-08T22:46:57.081 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116998 2026-03-08T22:46:57.081 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:46:57.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116998 -lt 115964116998 2026-03-08T22:46:57.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:57.303 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-150323855363 2026-03-08T22:46:57.303 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:57.304 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=4 2026-03-08T22:46:57.304 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-150323855363 2026-03-08T22:46:57.304 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:57.305 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855363 2026-03-08T22:46:57.305 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 150323855363' 2026-03-08T22:46:57.305 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 150323855363 2026-03-08T22:46:57.306 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:46:57.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855363 -lt 150323855363 2026-03-08T22:46:57.528 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:46:57.528 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:57.528 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:57.825 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:46:57.825 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:46:57.825 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:46:57.826 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:46:57.826 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:46:57.826 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:46:57.826 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:46:57.826 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:46:58.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:46:58.037 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:46:58.037 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:46:58.037 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:46:58.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:46:58.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:46:58.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:46:58.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:525: TEST_ec_backfill_unfound: ceph pg dump pgs 2026-03-08T22:46:58.589 INFO:tasks.workunit.client.0.vm04.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:46:58.589 INFO:tasks.workunit.client.0.vm04.stdout:2.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:54.477886+0000 0'0 42:12 [3,1,4,0,2] 3 [3,1,4,0,2] 3 0'0 2026-03-08T22:46:53.389027+0000 0'0 2026-03-08T22:46:53.389027+0000 0 0 periodic scrub scheduled @ 2026-03-10T00:43:03.091178+0000 0 0 2026-03-08T22:46:58.589 INFO:tasks.workunit.client.0.vm04.stdout:1.3 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:52.194580+0000 0'0 43:21 [4,1,2] 4 [4,1,2] 4 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:54:30.490844+0000 0 0 2026-03-08T22:46:58.589 INFO:tasks.workunit.client.0.vm04.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:52.193917+0000 0'0 42:100 [0,4,1] 0 
[0,4,1] 0 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:08:14.266814+0000 0 0 2026-03-08T22:46:58.589 INFO:tasks.workunit.client.0.vm04.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:52.193595+0000 0'0 43:21 [4,3,0] 4 [4,3,0] 4 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:16:06.319528+0000 0 0 2026-03-08T22:46:58.589 INFO:tasks.workunit.client.0.vm04.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:38.019351+0000 0'0 43:68 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:24:56.627823+0000 0 0 2026-03-08T22:46:58.589 INFO:tasks.workunit.client.0.vm04.stdout: 2026-03-08T22:46:58.589 INFO:tasks.workunit.client.0.vm04.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T22:46:58.589 INFO:tasks.workunit.client.0.vm04.stderr:dumped pgs 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:527: TEST_ec_backfill_unfound: rados_put td/test-erasure-eio pool-jerasure myobject 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=myobject 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: 
rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:46:58.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put myobject td/test-erasure-eio/ORIGINAL 2026-03-08T22:46:58.633 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:528: TEST_ec_backfill_unfound: get_primary pool-jerasure myobject 2026-03-08T22:46:58.633 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=pool-jerasure 2026-03-08T22:46:58.633 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=myobject 2026-03-08T22:46:58.633 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map pool-jerasure myobject 2026-03-08T22:46:58.633 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:46:58.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:528: TEST_ec_backfill_unfound: local primary=3 2026-03-08T22:46:58.863 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:530: TEST_ec_backfill_unfound: get_osds pool-jerasure 
myobject 2026-03-08T22:46:58.863 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:46:58.863 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=myobject 2026-03-08T22:46:58.863 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure myobject 2026-03-08T22:46:58.863 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:46:59.095 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:46:59.095 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:46:59.095 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:46:59.095 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:46:59.095 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:46:59.095 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 4 0 2 2026-03-08T22:46:59.095 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:530: TEST_ec_backfill_unfound: initial_osds=('3' '1' '4' '0' '2') 2026-03-08T22:46:59.095 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:530: TEST_ec_backfill_unfound: local -a initial_osds 2026-03-08T22:46:59.095 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:531: TEST_ec_backfill_unfound: local last_osd=2 2026-03-08T22:46:59.095 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:532: TEST_ec_backfill_unfound: kill_daemons td/test-erasure-eio TERM osd.2 2026-03-08T22:46:59.096 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:46:59.096 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:46:59.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:46:59.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:46:59.097 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:46:59.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:46:59.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:533: TEST_ec_backfill_unfound: ceph osd down 2 2026-03-08T22:46:59.442 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 is already down. 2026-03-08T22:46:59.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:534: TEST_ec_backfill_unfound: ceph osd out 2 2026-03-08T22:46:59.734 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 is already out. 
2026-03-08T22:46:59.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:536: TEST_ec_backfill_unfound: ceph pg dump pgs 2026-03-08T22:46:59.957 INFO:tasks.workunit.client.0.vm04.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:46:59.958 INFO:tasks.workunit.client.0.vm04.stdout:2.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:54.477886+0000 0'0 42:12 [3,1,4,0,2] 3 [3,1,4,0,2] 3 0'0 2026-03-08T22:46:53.389027+0000 0'0 2026-03-08T22:46:53.389027+0000 0 0 periodic scrub scheduled @ 2026-03-10T00:43:03.091178+0000 0 0 2026-03-08T22:46:59.958 INFO:tasks.workunit.client.0.vm04.stdout:1.3 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:52.194580+0000 0'0 43:21 [4,1,2] 4 [4,1,2] 4 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:54:30.490844+0000 0 0 2026-03-08T22:46:59.958 INFO:tasks.workunit.client.0.vm04.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:52.193917+0000 0'0 44:104 [0,4,1] 0 [0,4,1] 0 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:08:14.266814+0000 0 0 2026-03-08T22:46:59.958 INFO:tasks.workunit.client.0.vm04.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:52.193595+0000 0'0 43:21 [4,3,0] 4 [4,3,0] 4 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:16:06.319528+0000 0 0 2026-03-08T22:46:59.958 INFO:tasks.workunit.client.0.vm04.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:38.019351+0000 0'0 43:68 [1,0,2] 1 [1,0,2] 1 0'0 
2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:24:56.627823+0000 0 0 2026-03-08T22:46:59.958 INFO:tasks.workunit.client.0.vm04.stdout: 2026-03-08T22:46:59.958 INFO:tasks.workunit.client.0.vm04.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-08T22:46:59.958 INFO:tasks.workunit.client.0.vm04.stderr:dumped pgs 2026-03-08T22:46:59.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:538: TEST_ec_backfill_unfound: dd if=/dev/urandom of=td/test-erasure-eio/ORIGINAL bs=1024 count=4 2026-03-08T22:46:59.971 INFO:tasks.workunit.client.0.vm04.stderr:4+0 records in 2026-03-08T22:46:59.971 INFO:tasks.workunit.client.0.vm04.stderr:4+0 records out 2026-03-08T22:46:59.971 INFO:tasks.workunit.client.0.vm04.stderr:4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00016521 s, 24.8 MB/s 2026-03-08T22:46:59.971 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: seq 1 300 2026-03-08T22:46:59.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:46:59.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj1 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.004 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.004 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj2 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj3 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj4 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.095 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.095 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj5 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj6 
td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj7 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj8 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj9 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj10 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: 
TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj11 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj12 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.341 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.341 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj13 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj14 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.399 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj15 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj16 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj17 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj18 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.521 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.521 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put 
obj19 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj20 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj21 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj22 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.657 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.658 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj23 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.687 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj24 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj25 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.744 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.744 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj26 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj27 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.800 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:00.800 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj28 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj29 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj30 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj31 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.928 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.928 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados 
--pool pool-jerasure put obj32 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj33 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:00.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:00.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj34 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:01.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:01.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj35 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:01.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:01.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj36 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:01.074 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:01.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj37 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:01.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:01.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj38 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj39 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:01.173 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:01.173 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj40 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:01.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
[... 82 further iterations of the same two xtrace lines (test-erasure-eio.sh:539/:541) elided: rados --pool pool-jerasure put obj41 through obj122, timestamps 2026-03-08T22:47:01.202 through 2026-03-08T22:47:03.759 ...]
2026-03-08T22:47:03.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:03.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj123 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:03.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:03.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj124 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:03.843 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:03.843 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj125 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:03.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:03.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj126 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:03.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:03.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj127 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:03.937
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:03.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj128 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:03.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:03.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj129 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj130 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj131 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:04.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj132 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.090 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.090 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj133 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj134 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj135 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: 
rados --pool pool-jerasure put obj136 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj137 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj138 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj139 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj140 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.338 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj141 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj142 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj143 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj144 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:04.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj145 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.481 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.482 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj146 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.509 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.509 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj147 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.539 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.539 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj148 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: 
rados --pool pool-jerasure put obj149 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.605 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.605 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj150 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj151 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj152 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj153 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.724 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj154 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.760 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.760 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj155 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj156 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj157 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:04.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj158 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.884 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.884 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj159 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.913 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.913 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj160 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj161 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:04.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:04.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: 
rados --pool pool-jerasure put obj162 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj163 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj164 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.086 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.086 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj165 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj166 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.158 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj167 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj168 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.259 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj169 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj170 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:05.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj171 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.369 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.369 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj172 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj173 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj174 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.484 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.484 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: 
rados --pool pool-jerasure put obj175 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj176 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj177 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj178 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.622 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj179 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.663 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj180 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj181 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj182 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj183 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:05.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj184 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj185 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj186 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:05.965 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:05.965 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj187 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: 
rados --pool pool-jerasure put obj188 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.046 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.046 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj189 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj190 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj191 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj192 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.209 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj193 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj194 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj195 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.320 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.320 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj196 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.355 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:06.355 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj197 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.387 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.387 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj198 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj199 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj200 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: 
rados --pool pool-jerasure put obj201 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj202 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj203 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj204 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.620 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.620 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj205 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.652 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj206 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj207 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj208 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj209 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:06.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj210 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj211 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.884 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj212 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj213 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:06.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:06.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: 
rados --pool pool-jerasure put obj214 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj215 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj216 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj217 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj218 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.144 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.144 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj219 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj220 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.224 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.224 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj221 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj222 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:07.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj223 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj224 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.387 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.387 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj225 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj226 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: 
rados --pool pool-jerasure put obj227 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj228 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj229 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj230 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.629 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.629 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj231 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.666 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj232 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj233 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.798 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.798 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj234 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.833 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.833 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj235 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.869 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:07.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj236 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj237 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.942 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.942 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj238 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:07.983 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:07.983 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj239 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: 
rados --pool pool-jerasure put obj240 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj241 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj242 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.121 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.121 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj243 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj244 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.185 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj245 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj246 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj247 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj248 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:08.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj249 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj250 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj251 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj252 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: 
rados --pool pool-jerasure put obj253 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj254 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj255 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj256 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj257 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.633 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.633 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj258 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj259 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj260 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:08.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj261 td/test-erasure-eio/ORIGINAL 2026-03-08T22:47:08.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:08.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj262 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:08.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:08.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj263 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:08.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj264 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:08.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:08.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj265 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:08.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:08.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj266 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:08.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:08.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj267 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:08.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:08.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj268 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:08.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:08.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj269 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:08.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:08.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj270 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj271 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj272 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj273 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj274 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj275 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj276 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.227 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.227 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj277 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.260 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.261 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj278 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj279 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj280 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj281 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj282 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj283 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj284 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.502 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.502 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj285 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.538 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.538 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj286 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj287 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.612 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.612 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj288 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj289 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj290 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.710 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.710 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj291 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj292 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj293 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj294 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj295 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj296 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.920 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.920 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj297 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:09.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:09.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj298 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:10.005 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:10.005 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj299 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:10.048 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:539: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:10.048 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:541: TEST_ec_backfill_unfound: rados --pool pool-jerasure put obj300 td/test-erasure-eio/ORIGINAL
2026-03-08T22:47:10.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:544: TEST_ec_backfill_unfound: inject_eio ec data pool-jerasure obj250 td/test-erasure-eio 0
2026-03-08T22:47:10.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec
2026-03-08T22:47:10.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift
2026-03-08T22:47:10.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data
2026-03-08T22:47:10.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=obj250
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=0
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure obj250
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj250
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj250
2026-03-08T22:47:10.079 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T22:47:10.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T22:47:10.316 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:47:10.316 INFO:tasks.workunit.client.0.vm04.stderr:4
2026-03-08T22:47:10.316 INFO:tasks.workunit.client.0.vm04.stderr:0
2026-03-08T22:47:10.316 INFO:tasks.workunit.client.0.vm04.stderr:2147483647'
2026-03-08T22:47:10.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 4 0 2147483647
2026-03-08T22:47:10.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '1' '4' '0' '2147483647')
2026-03-08T22:47:10.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds
2026-03-08T22:47:10.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=3
2026-03-08T22:47:10.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']'
2026-03-08T22:47:10.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/3/type
2026-03-08T22:47:10.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore
2026-03-08T22:47:10.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 3 bluestore_debug_inject_read_err true
2026-03-08T22:47:10.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd
2026-03-08T22:47:10.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=3
2026-03-08T22:47:10.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err
2026-03-08T22:47:10.318 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true
2026-03-08T22:47:10.318 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")'
2026-03-08T22:47:10.318 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.3
2026-03-08T22:47:10.318 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3
2026-03-08T22:47:10.318 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']'
2026-03-08T22:47:10.319 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:47:10.319 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:10.319 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:47:10.319 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok
2026-03-08T22:47:10.319 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.3.asok config set bluestore_debug_inject_read_err true
2026-03-08T22:47:10.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true
2026-03-08T22:47:10.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0
2026-03-08T22:47:10.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid
2026-03-08T22:47:10.379 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.3
2026-03-08T22:47:10.379 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3
2026-03-08T22:47:10.379 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']'
2026-03-08T22:47:10.379 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:47:10.379 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:10.379 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:47:10.380 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok
2026-03-08T22:47:10.380 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS=
2026-03-08T22:47:10.380 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.3.asok injectdataerr pool-jerasure obj250 0
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:545: TEST_ec_backfill_unfound: inject_eio ec data pool-jerasure obj250 td/test-erasure-eio 1
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=obj250
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=1
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure obj250
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj250
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj250
2026-03-08T22:47:10.441 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T22:47:10.671 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T22:47:10.671 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:47:10.671 INFO:tasks.workunit.client.0.vm04.stderr:4
2026-03-08T22:47:10.671 INFO:tasks.workunit.client.0.vm04.stderr:0
2026-03-08T22:47:10.671 INFO:tasks.workunit.client.0.vm04.stderr:2147483647'
2026-03-08T22:47:10.671 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 4 0 2147483647
2026-03-08T22:47:10.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '1' '4' '0' '2147483647')
2026-03-08T22:47:10.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds
2026-03-08T22:47:10.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=1
2026-03-08T22:47:10.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']'
2026-03-08T22:47:10.671 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/1/type
2026-03-08T22:47:10.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore
2026-03-08T22:47:10.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 1 bluestore_debug_inject_read_err true
2026-03-08T22:47:10.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd
2026-03-08T22:47:10.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=1
2026-03-08T22:47:10.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err
2026-03-08T22:47:10.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true
2026-03-08T22:47:10.672 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")'
2026-03-08T22:47:10.673 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.1
2026-03-08T22:47:10.673 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1
2026-03-08T22:47:10.673 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']'
2026-03-08T22:47:10.673 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:47:10.673 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:10.673 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:47:10.673 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.1.asok
2026-03-08T22:47:10.673 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.1.asok config set bluestore_debug_inject_read_err true
2026-03-08T22:47:10.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true
2026-03-08T22:47:10.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0
2026-03-08T22:47:10.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid
2026-03-08T22:47:10.727 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.1
2026-03-08T22:47:10.727 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1
2026-03-08T22:47:10.727 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']'
2026-03-08T22:47:10.727 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:47:10.727 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:10.727 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:47:10.727 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.1.asok
2026-03-08T22:47:10.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS=
2026-03-08T22:47:10.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.1.asok injectdataerr pool-jerasure obj250 1
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:547: TEST_ec_backfill_unfound: activate_osd td/test-erasure-eio 2
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/2
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2'
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal'
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:47:10.781 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:47:10.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:47:10.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:47:10.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:47:10.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:47:10.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:47:10.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:47:10.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:47:10.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:47:10.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:47:10.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:47:10.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:47:10.783 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:47:10.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T22:47:10.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 
--osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:47:10.783 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/2/whoami 2026-03-08T22:47:10.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T22:47:10.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:47:10.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:47:10.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:47:10.803 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:10.803+0000 7f6059214780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:10.818 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:10.819+0000 7f6059214780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:10.820 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:10.820+0000 7f6059214780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:47:11.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T22:47:11.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:47:11.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:47:11.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:47:11.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:47:11.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:11.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:47:11.006 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:47:11.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:11.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:47:11.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:11.373 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:11.373+0000 7f6059214780 -1 Falling back to public interface 2026-03-08T22:47:12.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:47:12.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:12.220 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:47:12.221 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:47:12.221 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:12.221 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:47:12.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:13.031 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:13.032+0000 7f6059214780 -1 osd.2 43 log_to_monitors true 2026-03-08T22:47:13.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:13.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:13.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:47:13.443 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:47:13.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:13.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:47:13.665 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:14.254 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:14.255+0000 7f604fb02640 -1 osd.2 43 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:47:14.668 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:47:14.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:14.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:14.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:47:14.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:14.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:47:14.898 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up out weight 0 up_from 49 up_thru 19 down_at 44 last_clean_interval [19,43) [v2:127.0.0.1:6818/2760476814,v1:127.0.0.1:6819/2760476814] [v2:127.0.0.1:6820/2760476814,v1:127.0.0.1:6821/2760476814] exists,up 9d09b8f7-7eb9-43b3-b8be-eb3ac2ab9132 2026-03-08T22:47:14.898 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:47:14.898 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:47:14.898 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:47:14.898 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:548: TEST_ec_backfill_unfound: ceph osd in 2 2026-03-08T22:47:15.163 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 is already in. 2026-03-08T22:47:15.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:550: TEST_ec_backfill_unfound: sleep 15 2026-03-08T22:47:30.174 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:552: TEST_ec_backfill_unfound: seq 1 240 2026-03-08T22:47:30.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:552: TEST_ec_backfill_unfound: for tmp in $(seq 1 240) 2026-03-08T22:47:30.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:553: TEST_ec_backfill_unfound: get_state 2.0 2026-03-08T22:47:30.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:60: get_state: local pgid=2.0 2026-03-08T22:47:30.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:61: get_state: local sname=state 2026-03-08T22:47:30.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:62: get_state: ceph --format json pg dump pgs 2026-03-08T22:47:30.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:63: get_state: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .state' 2026-03-08T22:47:30.396 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:553: TEST_ec_backfill_unfound: state=active+backfill_unfound+undersized+degraded+remapped 2026-03-08T22:47:30.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:554: TEST_ec_backfill_unfound: echo active+backfill_unfound+undersized+degraded+remapped 2026-03-08T22:47:30.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:554: TEST_ec_backfill_unfound: grep backfill_unfound 2026-03-08T22:47:30.398 INFO:tasks.workunit.client.0.vm04.stdout:active+backfill_unfound+undersized+degraded+remapped 2026-03-08T22:47:30.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:555: TEST_ec_backfill_unfound: '[' 0 = 0 ']' 2026-03-08T22:47:30.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:556: TEST_ec_backfill_unfound: break 2026-03-08T22:47:30.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:562: TEST_ec_backfill_unfound: ceph pg dump pgs 2026-03-08T22:47:30.597 INFO:tasks.workunit.client.0.vm04.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:47:30.597 INFO:tasks.workunit.client.0.vm04.stdout:2.0 301 1 28 0 1 1232896 0 0 101 200 101 active+backfill_unfound+undersized+degraded+remapped 2026-03-08T22:47:17.171418+0000 47'301 53:894 [3,1,4,0,2] 3 [3,1,4,0,NONE] 3 0'0 
2026-03-08T22:46:53.389027+0000 0'0 2026-03-08T22:46:53.389027+0000 0 0 no scrub is scheduled 0 0 2026-03-08T22:47:30.597 INFO:tasks.workunit.client.0.vm04.stdout:1.3 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:47:15.253075+0000 0'0 53:56 [4,1,2] 4 [4,1,2] 4 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T04:58:18.413270+0000 0 0 2026-03-08T22:47:30.597 INFO:tasks.workunit.client.0.vm04.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:52.193917+0000 0'0 53:120 [0,4,1] 0 [0,4,1] 0 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:08:14.266814+0000 0 0 2026-03-08T22:47:30.597 INFO:tasks.workunit.client.0.vm04.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:52.193595+0000 0'0 53:41 [4,3,0] 4 [4,3,0] 4 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:16:06.319528+0000 0 0 2026-03-08T22:47:30.597 INFO:tasks.workunit.client.0.vm04.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:47:15.253135+0000 0'0 53:102 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:14:23.882143+0000 0 0 2026-03-08T22:47:30.597 INFO:tasks.workunit.client.0.vm04.stdout: 2026-03-08T22:47:30.597 INFO:tasks.workunit.client.0.vm04.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T22:47:30.597 INFO:tasks.workunit.client.0.vm04.stderr:dumped pgs 2026-03-08T22:47:30.607 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:563: TEST_ec_backfill_unfound: kill_daemons td/test-erasure-eio TERM osd.2 2026-03-08T22:47:30.608 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:47:30.608 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:47:30.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:47:30.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:47:30.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:47:30.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:47:30.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:564: TEST_ec_backfill_unfound: sleep 5 2026-03-08T22:47:35.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:566: TEST_ec_backfill_unfound: ceph pg dump pgs 2026-03-08T22:47:36.121 INFO:tasks.workunit.client.0.vm04.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN 
LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:47:36.121 INFO:tasks.workunit.client.0.vm04.stdout:2.0 301 1 28 0 1 1232896 0 0 101 200 101 active+backfill_unfound+undersized+degraded+remapped 2026-03-08T22:47:17.171418+0000 47'301 53:894 [3,1,4,0,2] 3 [3,1,4,0,NONE] 3 0'0 2026-03-08T22:46:53.389027+0000 0'0 2026-03-08T22:46:53.389027+0000 0 0 no scrub is scheduled 0 0 2026-03-08T22:47:36.121 INFO:tasks.workunit.client.0.vm04.stdout:1.3 0 0 0 0 0 0 0 0 0 0 0 active+undersized 2026-03-08T22:47:30.871150+0000 0'0 55:63 [4,1] 4 [4,1] 4 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 no scrub is scheduled 0 0 2026-03-08T22:47:36.121 INFO:tasks.workunit.client.0.vm04.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:52.193917+0000 0'0 55:124 [0,4,1] 0 [0,4,1] 0 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T01:08:14.266814+0000 0 0 2026-03-08T22:47:36.121 INFO:tasks.workunit.client.0.vm04.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:52.193595+0000 0'0 55:45 [4,3,0] 4 [4,3,0] 4 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:16:06.319528+0000 0 0 2026-03-08T22:47:36.121 INFO:tasks.workunit.client.0.vm04.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+undersized 2026-03-08T22:47:30.868983+0000 0'0 55:109 [1,0] 1 [1,0] 1 0'0 2026-03-08T22:46:16.276943+0000 0'0 2026-03-08T22:46:16.276943+0000 0 0 no scrub is scheduled 0 0 2026-03-08T22:47:36.121 INFO:tasks.workunit.client.0.vm04.stdout: 2026-03-08T22:47:36.121 INFO:tasks.workunit.client.0.vm04.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T22:47:36.122 INFO:tasks.workunit.client.0.vm04.stderr:dumped pgs 2026-03-08T22:47:36.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:567: TEST_ec_backfill_unfound: ceph pg 2.0 list_unfound 2026-03-08T22:47:36.199 INFO:tasks.workunit.client.0.vm04.stdout:{ 2026-03-08T22:47:36.199 INFO:tasks.workunit.client.0.vm04.stdout: "num_missing": 1, 2026-03-08T22:47:36.199 INFO:tasks.workunit.client.0.vm04.stdout: "num_unfound": 1, 2026-03-08T22:47:36.199 INFO:tasks.workunit.client.0.vm04.stdout: "objects": [ 2026-03-08T22:47:36.199 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:47:36.199 INFO:tasks.workunit.client.0.vm04.stdout: "oid": { 2026-03-08T22:47:36.199 INFO:tasks.workunit.client.0.vm04.stdout: "oid": "obj250", 2026-03-08T22:47:36.199 INFO:tasks.workunit.client.0.vm04.stdout: "key": "", 2026-03-08T22:47:36.199 INFO:tasks.workunit.client.0.vm04.stdout: "snapid": -2, 2026-03-08T22:47:36.199 INFO:tasks.workunit.client.0.vm04.stdout: "hash": 2249616407, 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "max": 0, 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "pool": 2, 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "namespace": "" 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "need": "47'251", 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "have": "0'0", 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "flags": "none", 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "clean_regions": "clean_offsets: [], clean_omap: false, new_object: true", 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "locations": [ 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "0(3)", 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "4(2)" 
2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: ] 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "state": "NotRecovering", 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "available_might_have_unfound": true, 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "might_have_unfound": [ 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "osd": "2(4)", 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "status": "osd is down" 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout: "more": false 2026-03-08T22:47:36.200 INFO:tasks.workunit.client.0.vm04.stdout:} 2026-03-08T22:47:36.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:568: TEST_ec_backfill_unfound: ceph pg 2.0 query 2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout:{ 2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "snap_trimq": "[]", 2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "snap_trimq_len": 0, 2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "state": "active+recovery_unfound+undersized+degraded", 2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "epoch": 55, 2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "up": [ 2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: 
0,
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: 2147483647
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "acting": [
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: 3,
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: 1,
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: 4,
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: 0,
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: 2147483647
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "acting_recovery_backfill": [
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "0(3)",
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "1(1)",
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "3(0)",
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "4(2)"
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "info": {
2026-03-08T22:47:36.275 INFO:tasks.workunit.client.0.vm04.stdout: "pgid": "2.0s0",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "shared": "0",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_update": "47'301",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_complete": "47'250",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "log_tail": "47'200",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_user_version": 301,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_backfill": "MAX",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [],
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "history": {
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_created": 41,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_pool_created": 41,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 55,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_started": 54,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 42,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_clean": 41,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_split": 0,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_marked_full": 0,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "same_up_since": 54,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "same_interval_since": 54,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "same_primary_since": 41,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "prior_readable_until_ub": 0
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: },
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "stats": {
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "version": "47'301",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "reported_seq": 907,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "reported_epoch": 55,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "state": "active+recovery_unfound+undersized+degraded",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_fresh": "2026-03-08T22:47:30.875546+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_change": "2026-03-08T22:47:30.875546+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_active": "2026-03-08T22:47:30.875546+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_peered": "2026-03-08T22:47:30.875546+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean": "2026-03-08T22:46:58.630673+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_active": "2026-03-08T22:47:30.872156+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_peered": "2026-03-08T22:47:30.872156+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_unstale": "2026-03-08T22:47:30.875546+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_undegraded": "2026-03-08T22:47:30.867396+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_fullsized": "2026-03-08T22:47:30.867218+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "mapping_epoch": 54,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "log_start": "47'200",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_start": "47'200",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "created": 41,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 42,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "parent": "0.0",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "parent_split_bits": 0,
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.276 INFO:tasks.workunit.client.0.vm04.stdout: "objects_scrubbed": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "log_size": 101,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "log_dups_size": 200,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_size": 101,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "stats_invalid": false,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "dirty_stats_invalid": false,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "omap_stats_invalid": false,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_stats_invalid": false,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_bytes_stats_invalid": false,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "pin_stats_invalid": false,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "manifest_stats_invalid": false,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrimq_len": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_duration": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_schedule": "no scrub is scheduled",
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_duration": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "objects_trimmed": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrim_duration": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "stat_sum": {
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes": 1232896,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects": 301,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_clones": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_copies": 1505,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing_on_primary": 1,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing": 1,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_degraded": 303,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_misplaced": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_unfound": 1,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_dirty": 301,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_whiteouts": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_read": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_read_kb": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_write": 301,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_write_kb": 1204,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_scrub_errors": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_shallow_scrub_errors": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_deep_scrub_errors": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_recovered": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_recovered": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_keys_recovered": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_omap": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_hit_set_archive": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_hit_set_archive": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_kb": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_kb": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_promote": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_high": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_low": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_some": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_full": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_pinned": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_legacy_snapsets": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_large_omap_objects": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_manifest": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_bytes": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_keys": 0,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_repaired": 0
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: },
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: "up": [
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: 3,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: 1,
2026-03-08T22:47:36.277 INFO:tasks.workunit.client.0.vm04.stdout: 4,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: 0,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: 2147483647
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "acting": [
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: 3,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: 1,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: 4,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: 0,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: 2147483647
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "avail_no_missing": [
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "0(3)",
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "4(2)"
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "object_location_counts": [
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: {
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "shards": "0(3),1(1),3(0),4(2)",
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "objects": 300
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: },
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: {
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "shards": "0(3),4(2)",
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "objects": 1
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: }
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "blocked_by": [],
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "up_primary": 3,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "acting_primary": 3,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": []
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: },
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "empty": 0,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "dne": 0,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "incomplete": 0,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 55,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "hit_set_history": {
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "current_last_update": "0'0",
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "history": []
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: }
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: },
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "peer_info": [
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: {
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "peer": "0(3)",
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "pgid": "2.0s3",
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "shared": "3",
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "last_update": "47'301",
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "last_complete": "47'301",
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "log_tail": "47'200",
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "last_user_version": 301,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "last_backfill": "MAX",
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [],
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "history": {
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_created": 41,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_pool_created": 41,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 55,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_started": 54,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 42,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_clean": 41,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_split": 0,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_marked_full": 0,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "same_up_since": 54,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "same_interval_since": 54,
2026-03-08T22:47:36.278 INFO:tasks.workunit.client.0.vm04.stdout: "same_primary_since": 41,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "prior_readable_until_ub": 0
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: },
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "stats": {
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "version": "47'301",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "reported_seq": 897,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "reported_epoch": 54,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "state": "active+undersized+degraded",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_fresh": "2026-03-08T22:47:10.077221+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_change": "2026-03-08T22:46:59.277911+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_active": "2026-03-08T22:47:10.077221+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_peered": "2026-03-08T22:47:10.077221+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean": "2026-03-08T22:46:58.630673+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_active": "2026-03-08T22:46:59.277911+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_peered": "2026-03-08T22:46:59.277911+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_unstale": "2026-03-08T22:47:10.077221+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_undegraded": "2026-03-08T22:46:59.269575+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_fullsized": "2026-03-08T22:46:59.269487+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "mapping_epoch": 54,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "log_start": "47'200",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_start": "47'200",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "created": 41,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 42,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "parent": "0.0",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "parent_split_bits": 0,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "objects_scrubbed": 0,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "log_size": 101,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "log_dups_size": 200,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_size": 101,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "stats_invalid": false,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "dirty_stats_invalid": false,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "omap_stats_invalid": false,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_stats_invalid": false,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_bytes_stats_invalid": false,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "pin_stats_invalid": false,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "manifest_stats_invalid": false,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrimq_len": 0,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_duration": 0,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_schedule": "periodic scrub scheduled @ 2026-03-10T08:00:21.176151+0000",
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_duration": 0,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "objects_trimmed": 0,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrim_duration": 0,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "stat_sum": {
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes": 1232896,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects": 301,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_clones": 0,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_copies": 1505,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing_on_primary": 0,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing": 0,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_degraded": 301,
2026-03-08T22:47:36.279 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_misplaced": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_unfound": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_dirty": 301,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_whiteouts": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_read": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_read_kb": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_write": 301,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_write_kb": 1204,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_scrub_errors": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_shallow_scrub_errors": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_deep_scrub_errors": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_recovered": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_recovered": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_keys_recovered": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_omap": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_hit_set_archive": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_hit_set_archive": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_kb": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_kb": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_promote": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_high": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_low": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_some": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_full": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_pinned": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_legacy_snapsets": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_large_omap_objects": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_manifest": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_bytes": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_keys": 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_repaired": 0
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: },
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "up": [
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: 3,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: 1,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: 4,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: 2147483647
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "acting": [
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: 3,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: 1,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: 4,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: 0,
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: 2147483647
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "avail_no_missing": [
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "3(0)",
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "0(3)",
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "1(1)",
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "4(2)"
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "object_location_counts": [
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: {
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "shards": "0(3),1(1),3(0),4(2)",
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: "objects": 301
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: }
2026-03-08T22:47:36.280 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "blocked_by": [],
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "up_primary": 3,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "acting_primary": 3,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": []
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: },
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "empty": 0,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "dne": 0,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "incomplete": 0,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 55,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "hit_set_history": {
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "current_last_update": "0'0",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "history": []
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: }
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: },
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: {
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "peer": "1(1)",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "pgid": "2.0s1",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "shared": "1",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_update": "47'301",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_complete": "47'301",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "log_tail": "47'200",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_user_version": 301,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_backfill": "MAX",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [],
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "history": {
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_created": 41,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_pool_created": 41,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 55,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_started": 54,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 42,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_clean": 41,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_split": 0,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_marked_full": 0,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "same_up_since": 54,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "same_interval_since": 54,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "same_primary_since": 41,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "prior_readable_until_ub": 0
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: },
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "stats": {
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "version": "47'301",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "reported_seq": 329,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "reported_epoch": 47,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "state": "active+undersized+degraded",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_fresh": "2026-03-08T22:47:10.077221+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_change": "2026-03-08T22:46:59.277911+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_active": "2026-03-08T22:47:10.077221+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_peered": "2026-03-08T22:47:10.077221+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean": "2026-03-08T22:46:58.630673+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_active": "2026-03-08T22:46:59.277911+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_peered": "2026-03-08T22:46:59.277911+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_unstale": "2026-03-08T22:47:10.077221+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_undegraded": "2026-03-08T22:46:59.269575+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_fullsized": "2026-03-08T22:46:59.269487+0000",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "mapping_epoch": 54,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "log_start": "47'200",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_start": "47'200",
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "created": 41,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 42,
2026-03-08T22:47:36.281 INFO:tasks.workunit.client.0.vm04.stdout: "parent": "0.0",
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "parent_split_bits": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0",
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0",
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:46:53.389027+0000",
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "objects_scrubbed": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "log_size": 101,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "log_dups_size": 200,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_size": 101,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "stats_invalid": false,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "dirty_stats_invalid": false,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "omap_stats_invalid": false,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_stats_invalid": false,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_bytes_stats_invalid": false,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "pin_stats_invalid": false,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "manifest_stats_invalid": false,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrimq_len": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_duration": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_schedule": "periodic scrub scheduled @ 2026-03-10T08:00:21.176151+0000",
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_duration": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "objects_trimmed": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrim_duration": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "stat_sum": {
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes": 1232896,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects": 301,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_clones": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_copies": 1505,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing_on_primary": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing": 1,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_degraded": 301,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_misplaced": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_unfound": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_dirty": 301,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_whiteouts": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_read": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_read_kb": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_write": 301,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_write_kb": 1204,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_scrub_errors": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_shallow_scrub_errors": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_deep_scrub_errors": 0,
2026-03-08T22:47:36.282 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_recovered": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_recovered": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_keys_recovered": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_omap": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_hit_set_archive": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_hit_set_archive": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_kb": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_kb": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_promote": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_high": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_low": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_some": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_full": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_pinned": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_legacy_snapsets": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_large_omap_objects": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_manifest": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_bytes": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_keys": 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_repaired": 0
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: },
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "up": [
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: 3,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: 1,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: 4,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: 2147483647
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "acting": [
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: 3,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: 1,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: 4,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: 0,
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: 2147483647
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "avail_no_missing": [
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "3(0)",
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "0(3)",
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "1(1)",
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "4(2)"
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: ],
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "object_location_counts": [
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: {
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "shards": "0(3),1(1),3(0),4(2)",
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "objects": 301
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: }
2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "blocked_by": [], 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "up_primary": 3, 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "acting_primary": 3, 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [] 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "empty": 0, 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "dne": 0, 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "incomplete": 0, 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 55, 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "hit_set_history": { 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "current_last_update": "0'0", 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: "history": [] 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.283 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "peer": "4(2)", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "pgid": "2.0s2", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "shared": "2", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_update": "47'301", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_complete": "47'301", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "log_tail": "47'200", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_user_version": 301, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_backfill": "MAX", 2026-03-08T22:47:36.284 
INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [], 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "history": { 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_created": 41, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_pool_created": 41, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 55, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_started": 54, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 42, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_clean": 41, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_split": 0, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_marked_full": 0, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "same_up_since": 54, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "same_interval_since": 54, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "same_primary_since": 41, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:46:53.389027+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:46:53.389027+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:46:53.389027+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "prior_readable_until_ub": 0 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "stats": { 2026-03-08T22:47:36.284 
INFO:tasks.workunit.client.0.vm04.stdout: "version": "47'301", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "reported_seq": 329, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "reported_epoch": 47, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "state": "active+undersized+degraded", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_fresh": "2026-03-08T22:47:10.077221+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_change": "2026-03-08T22:46:59.277911+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_active": "2026-03-08T22:47:10.077221+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_peered": "2026-03-08T22:47:10.077221+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean": "2026-03-08T22:46:58.630673+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_active": "2026-03-08T22:46:59.277911+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_peered": "2026-03-08T22:46:59.277911+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_unstale": "2026-03-08T22:47:10.077221+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_undegraded": "2026-03-08T22:46:59.269575+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_fullsized": "2026-03-08T22:46:59.269487+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "mapping_epoch": 54, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "log_start": "47'200", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_start": "47'200", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "created": 41, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 42, 2026-03-08T22:47:36.284 
INFO:tasks.workunit.client.0.vm04.stdout: "parent": "0.0", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "parent_split_bits": 0, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:46:53.389027+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:46:53.389027+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:46:53.389027+0000", 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "objects_scrubbed": 0, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "log_size": 101, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "log_dups_size": 200, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_size": 101, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "stats_invalid": false, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "dirty_stats_invalid": false, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "omap_stats_invalid": false, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_stats_invalid": false, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_bytes_stats_invalid": false, 2026-03-08T22:47:36.284 INFO:tasks.workunit.client.0.vm04.stdout: "pin_stats_invalid": false, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "manifest_stats_invalid": false, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrimq_len": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_duration": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_schedule": "periodic 
scrub scheduled @ 2026-03-10T08:00:21.176151+0000", 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_duration": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "objects_trimmed": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrim_duration": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "stat_sum": { 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes": 1232896, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects": 301, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_clones": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_copies": 1505, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing_on_primary": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_degraded": 301, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_misplaced": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_unfound": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_dirty": 301, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_whiteouts": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_read": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_read_kb": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_write": 301, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_write_kb": 1204, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_scrub_errors": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_shallow_scrub_errors": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 
"num_deep_scrub_errors": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_recovered": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_recovered": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_keys_recovered": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_omap": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_hit_set_archive": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_hit_set_archive": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_kb": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_kb": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_promote": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_high": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_low": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_some": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_full": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_pinned": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_legacy_snapsets": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_large_omap_objects": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_manifest": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_bytes": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_keys": 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 
"num_objects_repaired": 0 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "up": [ 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 2147483647 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "acting": [ 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: 2147483647 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "avail_no_missing": [ 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "3(0)", 2026-03-08T22:47:36.285 INFO:tasks.workunit.client.0.vm04.stdout: "0(3)", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "1(1)", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "4(2)" 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "object_location_counts": [ 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "shards": "0(3),1(1),3(0),4(2)", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "objects": 301 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:47:36.286 
INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "blocked_by": [], 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "up_primary": 3, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "acting_primary": 3, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [] 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "empty": 0, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "dne": 0, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "incomplete": 0, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 55, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "hit_set_history": { 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "current_last_update": "0'0", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "history": [] 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "recovery_state": [ 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "name": "Started/Primary/Active", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "enter_time": "2026-03-08T22:47:30.867238+0000", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "might_have_unfound": [ 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "osd": "0(3)", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "status": "already probed" 2026-03-08T22:47:36.286 
INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "osd": "1(1)", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "status": "already probed" 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "osd": "2(4)", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "status": "osd is down" 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "osd": "4(2)", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "status": "already probed" 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "recovery_progress": { 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "backfill_targets": [], 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "waiting_on_backfill": [], 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "last_backfill_started": "MIN", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "backfill_info": { 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "begin": "MIN", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "end": "MIN", 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "objects": [] 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "peer_backfill_info": [], 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "backfills_in_flight": [], 
2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "recovering": [], 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "pg_backend": { 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "recovery_ops": [], 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: "read_ops": [] 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.286 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: "name": "Started", 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: "enter_time": "2026-03-08T22:47:30.745082+0000" 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: "scrubber": { 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: "active": false, 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: "must_scrub": false, 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: "must_deep_scrub": false, 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: "must_repair": false, 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: "need_auto": false, 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_reg_stamp": "2026-03-10T08:00:21.176151+0000", 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: "schedule": "no scrub is scheduled" 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout: "agent_state": {} 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stdout:} 2026-03-08T22:47:36.287 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:570: TEST_ec_backfill_unfound: ceph pg 2.0 list_unfound 2026-03-08T22:47:36.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:570: TEST_ec_backfill_unfound: grep -q obj250 2026-03-08T22:47:36.355 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:572: TEST_ec_backfill_unfound: ceph pg 2.0 list_unfound 2026-03-08T22:47:36.355 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:572: TEST_ec_backfill_unfound: jq .available_might_have_unfound 2026-03-08T22:47:36.424 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:572: TEST_ec_backfill_unfound: check=true 2026-03-08T22:47:36.424 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:573: TEST_ec_backfill_unfound: test true == true 2026-03-08T22:47:36.424 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:575: TEST_ec_backfill_unfound: ceph pg 2.0 list_unfound 2026-03-08T22:47:36.425 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:575: TEST_ec_backfill_unfound: jq '.might_have_unfound[0].status' 2026-03-08T22:47:36.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:575: TEST_ec_backfill_unfound: eval 'check="osd' is 'down"' 2026-03-08T22:47:36.497 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:575: TEST_ec_backfill_unfound: 
check='osd is down' 2026-03-08T22:47:36.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:576: TEST_ec_backfill_unfound: test 'osd is down' == 'osd is down' 2026-03-08T22:47:36.497 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:578: TEST_ec_backfill_unfound: ceph pg 2.0 list_unfound 2026-03-08T22:47:36.497 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:578: TEST_ec_backfill_unfound: jq '.might_have_unfound[0].osd' 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:578: TEST_ec_backfill_unfound: eval 'check="2(4)"' 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:578: TEST_ec_backfill_unfound: check='2(4)' 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:579: TEST_ec_backfill_unfound: test '2(4)' == '2(4)' 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:581: TEST_ec_backfill_unfound: activate_osd td/test-erasure-eio 2 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T22:47:36.572 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:47:36.572 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T22:47:36.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/2
2026-03-08T22:47:36.573 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2
2026-03-08T22:47:36.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2
2026-03-08T22:47:36.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:47:36.573 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/2/whoami
2026-03-08T22:47:36.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']'
2026-03-08T22:47:36.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:47:36.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:47:36.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:47:36.592 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:36.592+0000 7f6feff19780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:47:36.601 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:36.602+0000 7f6feff19780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:47:36.603 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:36.603+0000 7f6feff19780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:47:36.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2
2026-03-08T22:47:36.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:47:36.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T22:47:36.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:47:36.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:47:36.804 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:47:36.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:47:36.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:47:36.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:47:36.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:47:37.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:47:37.411 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:37.411+0000 7f6feff19780 -1 Falling back to public interface
2026-03-08T22:47:38.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:47:38.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:47:38.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:47:38.019 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:47:38.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:47:38.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:47:38.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:47:38.356 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:38.357+0000 7f6feff19780 -1 osd.2 53 log_to_monitors true
2026-03-08T22:47:39.230 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:47:39.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:47:39.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:47:39.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:47:39.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:47:39.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:47:39.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:47:39.761 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:39.762+0000 7f6fe6eb1640 -1 osd.2 53 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T22:47:40.469 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:47:40.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:47:40.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:47:40.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:47:40.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:47:40.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:47:40.693 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 57 up_thru 19 down_at 54 last_clean_interval [49,53) [v2:127.0.0.1:6818/4223821539,v1:127.0.0.1:6819/4223821539] [v2:127.0.0.1:6820/4223821539,v1:127.0.0.1:6821/4223821539] exists,up 9d09b8f7-7eb9-43b3-b8be-eb3ac2ab9132
2026-03-08T22:47:40.693 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:47:40.693 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:47:40.694 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:47:40.694 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:584: TEST_ec_backfill_unfound: timeout 5 rados -p pool-jerasure get obj250 td/test-erasure-eio/CHECK
2026-03-08T22:47:45.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:585: TEST_ec_backfill_unfound: test 124 = 124
2026-03-08T22:47:45.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:587: TEST_ec_backfill_unfound: ceph pg 2.0 mark_unfound_lost delete
2026-03-08T22:47:45.772 INFO:tasks.workunit.client.0.vm04.stderr:pg has no unfound objects
2026-03-08T22:47:45.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:589: TEST_ec_backfill_unfound: wait_for_clean
2026-03-08T22:47:45.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:47:45.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:47:45.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:47:45.789 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:47:45.789 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:47:45.789 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:47:45.789 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:47:45.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:47:45.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:47:45.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:47:45.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:47:45.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:47:45.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:47:45.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:47:45.871 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:47:46.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:47:46.105 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:47:46.105 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:47:46.105 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:47:46.105 INFO:tasks.workunit.client.0.vm04.stderr:4'
2026-03-08T22:47:46.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:47:46.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:47:46.105 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:47:46.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803797
2026-03-08T22:47:46.179 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803797
2026-03-08T22:47:46.179 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803797'
2026-03-08T22:47:46.179 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:47:46.179 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:47:46.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574867
2026-03-08T22:47:46.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574867
2026-03-08T22:47:46.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803797 1-55834574867'
2026-03-08T22:47:46.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:47:46.248 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:47:46.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=244813135876
2026-03-08T22:47:46.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 244813135876
2026-03-08T22:47:46.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803797 1-55834574867 2-244813135876'
2026-03-08T22:47:46.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:47:46.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:47:46.389 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117010
2026-03-08T22:47:46.389 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117010
2026-03-08T22:47:46.389 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803797 1-55834574867 2-244813135876 3-115964117010'
2026-03-08T22:47:46.389 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:47:46.389 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T22:47:46.464 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855375
2026-03-08T22:47:46.464 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855375
2026-03-08T22:47:46.464 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803797 1-55834574867 2-244813135876 3-115964117010 4-150323855375'
2026-03-08T22:47:46.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:47:46.465 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803797
2026-03-08T22:47:46.465 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:47:46.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:47:46.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803797
2026-03-08T22:47:46.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:47:46.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803797
2026-03-08T22:47:46.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803797'
2026-03-08T22:47:46.467 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803797
2026-03-08T22:47:46.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:47:46.683 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803797 -lt 25769803797
2026-03-08T22:47:46.683 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:47:46.683 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574867
2026-03-08T22:47:46.683 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:47:46.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:47:46.684 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574867
2026-03-08T22:47:46.684 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:47:46.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574867
2026-03-08T22:47:46.685 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574867
2026-03-08T22:47:46.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574867'
2026-03-08T22:47:46.685 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:47:46.898 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574868 -lt 55834574867
2026-03-08T22:47:46.898 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:47:46.898 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-244813135876
2026-03-08T22:47:46.898 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:47:46.899 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:47:46.899 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-244813135876
2026-03-08T22:47:46.900 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:47:46.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=244813135876
2026-03-08T22:47:46.900 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 244813135876
2026-03-08T22:47:46.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 244813135876'
2026-03-08T22:47:46.900 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:47:47.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 244813135876 -lt 244813135876
2026-03-08T22:47:47.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:47:47.103 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117010
2026-03-08T22:47:47.103 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:47:47.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:47:47.104 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117010
2026-03-08T22:47:47.104 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:47:47.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117010
2026-03-08T22:47:47.105 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964117010
2026-03-08T22:47:47.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117010'
2026-03-08T22:47:47.105 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:47:47.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117010 -lt 115964117010
2026-03-08T22:47:47.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:47:47.313 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-150323855375
2026-03-08T22:47:47.313 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:47:47.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T22:47:47.314 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-150323855375
2026-03-08T22:47:47.314 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:47:47.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855375
2026-03-08T22:47:47.315 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 150323855375
2026-03-08T22:47:47.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 150323855375'
2026-03-08T22:47:47.315 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T22:47:47.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855375 -lt 150323855375
2026-03-08T22:47:47.521 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:47:47.521 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:47:47.521 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:47:47.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:47:47.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:47:47.794 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:47:47.794 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:47:47.794 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:47:47.794 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:47:47.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:47:47.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:47:48.001 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:47:48.001 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:47:48.001 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:47:48.001 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:47:48.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:47:48.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:47:48.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:47:48.285 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: seq 1 300
2026-03-08T22:47:48.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:48.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj1 = obj250 ']'
2026-03-08T22:47:48.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj1 td/test-erasure-eio/CHECK
2026-03-08T22:47:48.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:48.312 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:48.312 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj2 = obj250 ']'
2026-03-08T22:47:48.312 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj2 td/test-erasure-eio/CHECK
2026-03-08T22:47:48.335 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:48.336 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:48.336 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj3 = obj250 ']'
2026-03-08T22:47:48.336 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj3 td/test-erasure-eio/CHECK
2026-03-08T22:47:48.358 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:48.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:48.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj4 = obj250 ']'
2026-03-08T22:47:48.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj4 td/test-erasure-eio/CHECK
2026-03-08T22:47:48.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:48.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:48.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj5 = obj250 ']'
2026-03-08T22:47:48.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj5 td/test-erasure-eio/CHECK
2026-03-08T22:47:48.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:48.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:48.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj6 = obj250 ']'
2026-03-08T22:47:48.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj6 td/test-erasure-eio/CHECK
2026-03-08T22:47:48.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:48.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:48.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj7 = obj250 ']'
2026-03-08T22:47:48.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj7 td/test-erasure-eio/CHECK
2026-03-08T22:47:48.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:48.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:48.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj8 = obj250 ']'
2026-03-08T22:47:48.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj8 td/test-erasure-eio/CHECK
2026-03-08T22:47:48.474 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:48.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:48.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj9 = obj250 ']'
2026-03-08T22:47:48.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj9 td/test-erasure-eio/CHECK
2026-03-08T22:47:48.509 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff
-q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.510 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.510 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj10 = obj250 ']' 2026-03-08T22:47:48.510 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj10 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj11 = obj250 ']' 2026-03-08T22:47:48.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj11 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.556 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj12 = obj250 ']' 2026-03-08T22:47:48.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj12 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj13 = obj250 ']' 2026-03-08T22:47:48.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj13 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.607 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.608 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj14 = obj250 ']' 2026-03-08T22:47:48.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj14 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj15 = obj250 ']' 2026-03-08T22:47:48.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj15 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj16 = obj250 ']' 2026-03-08T22:47:48.656 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj16 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.680 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.680 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj17 = obj250 ']' 2026-03-08T22:47:48.680 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj17 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj18 = obj250 ']' 2026-03-08T22:47:48.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj18 
td/test-erasure-eio/CHECK 2026-03-08T22:47:48.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj19 = obj250 ']' 2026-03-08T22:47:48.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj19 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj20 = obj250 ']' 2026-03-08T22:47:48.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj20 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: 
diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj21 = obj250 ']' 2026-03-08T22:47:48.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj21 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.799 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.800 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.800 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj22 = obj250 ']' 2026-03-08T22:47:48.800 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj22 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.823 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj23 = obj250 ']' 2026-03-08T22:47:48.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj23 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj24 = obj250 ']' 2026-03-08T22:47:48.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj24 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.876 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.876 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.877 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj25 = obj250 ']' 2026-03-08T22:47:48.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj25 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj26 = obj250 ']' 2026-03-08T22:47:48.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj26 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj27 = obj250 ']' 2026-03-08T22:47:48.924 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj27 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj28 = obj250 ']' 2026-03-08T22:47:48.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj28 td/test-erasure-eio/CHECK 2026-03-08T22:47:48.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj29 = obj250 ']' 2026-03-08T22:47:48.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj29 
td/test-erasure-eio/CHECK 2026-03-08T22:47:48.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:48.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:48.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj30 = obj250 ']' 2026-03-08T22:47:48.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj30 td/test-erasure-eio/CHECK 2026-03-08T22:47:49.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:49.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:49.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj31 = obj250 ']' 2026-03-08T22:47:49.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj31 td/test-erasure-eio/CHECK 2026-03-08T22:47:49.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: 
diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:49.046 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:49.046 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj32 = obj250 ']' 2026-03-08T22:47:49.046 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj32 td/test-erasure-eio/CHECK 2026-03-08T22:47:49.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:49.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:49.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj33 = obj250 ']' 2026-03-08T22:47:49.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj33 td/test-erasure-eio/CHECK 2026-03-08T22:47:49.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:49.099 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:49.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj34 = obj250 ']' 2026-03-08T22:47:49.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj34 td/test-erasure-eio/CHECK 2026-03-08T22:47:49.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:49.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:49.125 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj35 = obj250 ']' 2026-03-08T22:47:49.125 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj35 td/test-erasure-eio/CHECK 2026-03-08T22:47:49.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:49.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:49.150 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj36 = obj250 ']' 2026-03-08T22:47:49.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj36 td/test-erasure-eio/CHECK 2026-03-08T22:47:49.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:49.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:49.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj37 = obj250 ']' 2026-03-08T22:47:49.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj37 td/test-erasure-eio/CHECK 2026-03-08T22:47:49.206 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:49.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:49.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj38 = obj250 ']' 2026-03-08T22:47:49.207 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj38 td/test-erasure-eio/CHECK
2026-03-08T22:47:49.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:49.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:49.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj39 = obj250 ']'
2026-03-08T22:47:49.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj39 td/test-erasure-eio/CHECK
2026-03-08T22:47:49.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:49.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:49.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj40 = obj250 ']'
2026-03-08T22:47:49.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj40 td/test-erasure-eio/CHECK
[identical xtrace repeats for obj41 through obj83 (2026-03-08T22:47:49.283 through 2026-03-08T22:47:50.327); each iteration logs the same four records — sh:591 for, sh:593 '[' objN = obj250 ']', sh:597 rados get, sh:598 diff -q — all with clean exits]
2026-03-08T22:47:50.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:50.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj84 = obj250 ']'
2026-03-08T22:47:50.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj84
td/test-erasure-eio/CHECK 2026-03-08T22:47:50.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj85 = obj250 ']' 2026-03-08T22:47:50.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj85 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj86 = obj250 ']' 2026-03-08T22:47:50.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj86 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: 
diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj87 = obj250 ']' 2026-03-08T22:47:50.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj87 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.421 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.421 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj88 = obj250 ']' 2026-03-08T22:47:50.421 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj88 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.444 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj89 = obj250 ']' 2026-03-08T22:47:50.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj89 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj90 = obj250 ']' 2026-03-08T22:47:50.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj90 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.489 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.489 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj91 = obj250 ']' 2026-03-08T22:47:50.489 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj91 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.510 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj92 = obj250 ']' 2026-03-08T22:47:50.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj92 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj93 = obj250 ']' 2026-03-08T22:47:50.533 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj93 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj94 = obj250 ']' 2026-03-08T22:47:50.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj94 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj95 = obj250 ']' 2026-03-08T22:47:50.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj95 
td/test-erasure-eio/CHECK 2026-03-08T22:47:50.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj96 = obj250 ']' 2026-03-08T22:47:50.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj96 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj97 = obj250 ']' 2026-03-08T22:47:50.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj97 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: 
diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj98 = obj250 ']' 2026-03-08T22:47:50.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj98 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj99 = obj250 ']' 2026-03-08T22:47:50.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj99 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.694 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.695 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj100 = obj250 ']' 2026-03-08T22:47:50.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj100 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj101 = obj250 ']' 2026-03-08T22:47:50.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj101 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:50.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj102 = obj250 ']' 2026-03-08T22:47:50.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj102 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.764 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj103 = obj250 ']' 2026-03-08T22:47:50.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj103 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj104 = obj250 ']' 
2026-03-08T22:47:50.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj104 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj105 = obj250 ']' 2026-03-08T22:47:50.809 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj105 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj106 = obj250 ']' 2026-03-08T22:47:50.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool 
pool-jerasure get obj106 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj107 = obj250 ']' 2026-03-08T22:47:50.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj107 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj108 = obj250 ']' 2026-03-08T22:47:50.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj108 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.901 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj109 = obj250 ']' 2026-03-08T22:47:50.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj109 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj110 = obj250 ']' 2026-03-08T22:47:50.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj110 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:47:50.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj111 = obj250 ']' 2026-03-08T22:47:50.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj111 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj112 = obj250 ']' 2026-03-08T22:47:50.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj112 td/test-erasure-eio/CHECK 2026-03-08T22:47:50.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:50.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: 
TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:50.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj113 = obj250 ']' 2026-03-08T22:47:50.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj113 td/test-erasure-eio/CHECK 2026-03-08T22:47:51.022 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:51.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:51.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj114 = obj250 ']' 2026-03-08T22:47:51.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj114 td/test-erasure-eio/CHECK 2026-03-08T22:47:51.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:51.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:51.049 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj115 = obj250 ']'
2026-03-08T22:47:51.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj115 td/test-erasure-eio/CHECK
2026-03-08T22:47:51.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:51.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
[the identical :591/:593/:597/:598 trace repeats for obj116 through obj157 between 2026-03-08T22:47:51.074 and 2026-03-08T22:47:52.099: each object is compared against obj250 at line :593, fetched with `rados --pool pool-jerasure get objN td/test-erasure-eio/CHECK` at line :597, and verified byte-for-byte against td/test-erasure-eio/ORIGINAL with `diff -q` at line :598]
2026-03-08T22:47:52.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj158 = obj250 ']'
2026-03-08T22:47:52.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj158 td/test-erasure-eio/CHECK
2026-03-08T22:47:52.120 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL
td/test-erasure-eio/CHECK 2026-03-08T22:47:52.120 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.120 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj159 = obj250 ']' 2026-03-08T22:47:52.120 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj159 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj160 = obj250 ']' 2026-03-08T22:47:52.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj160 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.162 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: 
TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj161 = obj250 ']' 2026-03-08T22:47:52.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj161 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj162 = obj250 ']' 2026-03-08T22:47:52.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj162 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.207 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj163 = obj250 ']' 2026-03-08T22:47:52.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj163 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj164 = obj250 ']' 2026-03-08T22:47:52.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj164 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.250 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.250 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.250 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj165 = obj250 ']' 2026-03-08T22:47:52.250 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj165 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj166 = obj250 ']' 2026-03-08T22:47:52.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj166 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj167 = obj250 ']' 2026-03-08T22:47:52.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj167 
td/test-erasure-eio/CHECK 2026-03-08T22:47:52.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj168 = obj250 ']' 2026-03-08T22:47:52.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj168 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj169 = obj250 ']' 2026-03-08T22:47:52.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj169 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.360 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: 
TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj170 = obj250 ']' 2026-03-08T22:47:52.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj170 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.382 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj171 = obj250 ']' 2026-03-08T22:47:52.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj171 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.406 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj172 = obj250 ']' 2026-03-08T22:47:52.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj172 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj173 = obj250 ']' 2026-03-08T22:47:52.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj173 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:52.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj174 = obj250 ']' 2026-03-08T22:47:52.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj174 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.479 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.480 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.480 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj175 = obj250 ']' 2026-03-08T22:47:52.480 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj175 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj176 = obj250 ']' 
2026-03-08T22:47:52.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj176 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj177 = obj250 ']' 2026-03-08T22:47:52.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj177 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj178 = obj250 ']' 2026-03-08T22:47:52.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool 
pool-jerasure get obj178 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj179 = obj250 ']' 2026-03-08T22:47:52.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj179 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.607 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj180 = obj250 ']' 2026-03-08T22:47:52.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj180 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.633 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj181 = obj250 ']' 2026-03-08T22:47:52.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj181 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj182 = obj250 ']' 2026-03-08T22:47:52.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj182 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:47:52.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj183 = obj250 ']' 2026-03-08T22:47:52.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj183 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj184 = obj250 ']' 2026-03-08T22:47:52.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj184 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: 
TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj185 = obj250 ']' 2026-03-08T22:47:52.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj185 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj186 = obj250 ']' 2026-03-08T22:47:52.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj186 td/test-erasure-eio/CHECK 2026-03-08T22:47:52.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:52.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:52.803 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj187 = obj250 ']'
2026-03-08T22:47:52.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj187 td/test-erasure-eio/CHECK
2026-03-08T22:47:52.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:52.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:52.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj188 = obj250 ']'
2026-03-08T22:47:52.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj188 td/test-erasure-eio/CHECK
2026-03-08T22:47:52.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:52.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:52.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj189 = obj250 ']'
2026-03-08T22:47:52.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj189 td/test-erasure-eio/CHECK
2026-03-08T22:47:52.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:52.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:52.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj190 = obj250 ']'
2026-03-08T22:47:52.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj190 td/test-erasure-eio/CHECK
2026-03-08T22:47:52.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:52.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:52.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj191 = obj250 ']'
2026-03-08T22:47:52.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj191 td/test-erasure-eio/CHECK
2026-03-08T22:47:52.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:52.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:52.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj192 = obj250 ']'
2026-03-08T22:47:52.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj192 td/test-erasure-eio/CHECK
2026-03-08T22:47:52.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:52.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:52.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj193 = obj250 ']'
2026-03-08T22:47:52.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj193 td/test-erasure-eio/CHECK
2026-03-08T22:47:52.983 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:52.984 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:52.984 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj194 = obj250 ']'
2026-03-08T22:47:52.984 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj194 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj195 = obj250 ']'
2026-03-08T22:47:53.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj195 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj196 = obj250 ']'
2026-03-08T22:47:53.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj196 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj197 = obj250 ']'
2026-03-08T22:47:53.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj197 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj198 = obj250 ']'
2026-03-08T22:47:53.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj198 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.102 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj199 = obj250 ']'
2026-03-08T22:47:53.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj199 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj200 = obj250 ']'
2026-03-08T22:47:53.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj200 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj201 = obj250 ']'
2026-03-08T22:47:53.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj201 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.173 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj202 = obj250 ']'
2026-03-08T22:47:53.173 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj202 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj203 = obj250 ']'
2026-03-08T22:47:53.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj203 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.221 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.222 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.222 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj204 = obj250 ']'
2026-03-08T22:47:53.222 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj204 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj205 = obj250 ']'
2026-03-08T22:47:53.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj205 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj206 = obj250 ']'
2026-03-08T22:47:53.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj206 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj207 = obj250 ']'
2026-03-08T22:47:53.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj207 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj208 = obj250 ']'
2026-03-08T22:47:53.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj208 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj209 = obj250 ']'
2026-03-08T22:47:53.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj209 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj210 = obj250 ']'
2026-03-08T22:47:53.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj210 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj211 = obj250 ']'
2026-03-08T22:47:53.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj211 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj212 = obj250 ']'
2026-03-08T22:47:53.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj212 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj213 = obj250 ']'
2026-03-08T22:47:53.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj213 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj214 = obj250 ']'
2026-03-08T22:47:53.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj214 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj215 = obj250 ']'
2026-03-08T22:47:53.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj215 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj216 = obj250 ']'
2026-03-08T22:47:53.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj216 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.515 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj217 = obj250 ']'
2026-03-08T22:47:53.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj217 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.539 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj218 = obj250 ']'
2026-03-08T22:47:53.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj218 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.564 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj219 = obj250 ']'
2026-03-08T22:47:53.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj219 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj220 = obj250 ']'
2026-03-08T22:47:53.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj220 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj221 = obj250 ']'
2026-03-08T22:47:53.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj221 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj222 = obj250 ']'
2026-03-08T22:47:53.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj222 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.657 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.658 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.658 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj223 = obj250 ']'
2026-03-08T22:47:53.658 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj223 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj224 = obj250 ']'
2026-03-08T22:47:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj224 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.702 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj225 = obj250 ']'
2026-03-08T22:47:53.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj225 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj226 = obj250 ']'
2026-03-08T22:47:53.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj226 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj227 = obj250 ']'
2026-03-08T22:47:53.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj227 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.768 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.768 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj228 = obj250 ']'
2026-03-08T22:47:53.768 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj228 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj229 = obj250 ']'
2026-03-08T22:47:53.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj229 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj230 = obj250 ']'
2026-03-08T22:47:53.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj230 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.839 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.839 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj231 = obj250 ']'
2026-03-08T22:47:53.839 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj231 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:47:53.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj232 = obj250 ']'
2026-03-08T22:47:53.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj232 td/test-erasure-eio/CHECK
2026-03-08T22:47:53.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:47:53.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591:
TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:53.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj233 = obj250 ']' 2026-03-08T22:47:53.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj233 td/test-erasure-eio/CHECK 2026-03-08T22:47:53.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:53.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:53.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj234 = obj250 ']' 2026-03-08T22:47:53.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj234 td/test-erasure-eio/CHECK 2026-03-08T22:47:53.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:53.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:53.935 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj235 = obj250 ']' 2026-03-08T22:47:53.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj235 td/test-erasure-eio/CHECK 2026-03-08T22:47:53.957 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:53.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:53.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj236 = obj250 ']' 2026-03-08T22:47:53.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj236 td/test-erasure-eio/CHECK 2026-03-08T22:47:53.983 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:53.984 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:53.984 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj237 = obj250 ']' 2026-03-08T22:47:53.984 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj237 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj238 = obj250 ']' 2026-03-08T22:47:54.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj238 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj239 = obj250 ']' 2026-03-08T22:47:54.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj239 
td/test-erasure-eio/CHECK 2026-03-08T22:47:54.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj240 = obj250 ']' 2026-03-08T22:47:54.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj240 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj241 = obj250 ']' 2026-03-08T22:47:54.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj241 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.109 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: 
TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj242 = obj250 ']' 2026-03-08T22:47:54.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj242 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.135 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.135 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj243 = obj250 ']' 2026-03-08T22:47:54.135 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj243 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.159 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj244 = obj250 ']' 2026-03-08T22:47:54.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj244 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.183 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.183 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj245 = obj250 ']' 2026-03-08T22:47:54.183 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj245 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.210 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:54.210 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj246 = obj250 ']' 2026-03-08T22:47:54.210 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj246 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj247 = obj250 ']' 2026-03-08T22:47:54.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj247 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj248 = obj250 ']' 
2026-03-08T22:47:54.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj248 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj249 = obj250 ']' 2026-03-08T22:47:54.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj249 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj250 = obj250 ']' 2026-03-08T22:47:54.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:595: TEST_ec_backfill_unfound: rados -p pool-jerasure 
get obj250 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.338 INFO:tasks.workunit.client.0.vm04.stderr:error getting pool-jerasure/obj250: (2) No such file or directory 2026-03-08T22:47:54.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj251 = obj250 ']' 2026-03-08T22:47:54.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj251 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.364 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj252 = obj250 ']' 2026-03-08T22:47:54.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj252 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.388 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.388 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.388 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj253 = obj250 ']' 2026-03-08T22:47:54.389 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj253 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj254 = obj250 ']' 2026-03-08T22:47:54.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj254 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:54.437 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj255 = obj250 ']' 2026-03-08T22:47:54.437 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj255 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj256 = obj250 ']' 2026-03-08T22:47:54.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj256 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.484 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj257 = obj250 ']' 
2026-03-08T22:47:54.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj257 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj258 = obj250 ']' 2026-03-08T22:47:54.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj258 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.531 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj259 = obj250 ']' 2026-03-08T22:47:54.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool 
pool-jerasure get obj259 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj260 = obj250 ']' 2026-03-08T22:47:54.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj260 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj261 = obj250 ']' 2026-03-08T22:47:54.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj261 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.612 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj262 = obj250 ']' 2026-03-08T22:47:54.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj262 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.636 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj263 = obj250 ']' 2026-03-08T22:47:54.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj263 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.659 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:47:54.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj264 = obj250 ']' 2026-03-08T22:47:54.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj264 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj265 = obj250 ']' 2026-03-08T22:47:54.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj265 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: 
TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj266 = obj250 ']' 2026-03-08T22:47:54.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj266 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj267 = obj250 ']' 2026-03-08T22:47:54.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj267 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.755 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.755 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.755 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj268 = obj250 ']' 2026-03-08T22:47:54.755 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj268 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj269 = obj250 ']' 2026-03-08T22:47:54.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj269 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj270 = obj250 ']' 2026-03-08T22:47:54.808 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj270 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj271 = obj250 ']' 2026-03-08T22:47:54.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj271 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj272 = obj250 ']' 2026-03-08T22:47:54.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj272 
td/test-erasure-eio/CHECK 2026-03-08T22:47:54.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj273 = obj250 ']' 2026-03-08T22:47:54.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj273 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj274 = obj250 ']' 2026-03-08T22:47:54.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj274 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.926 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: 
TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj275 = obj250 ']' 2026-03-08T22:47:54.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj275 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj276 = obj250 ']' 2026-03-08T22:47:54.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj276 td/test-erasure-eio/CHECK 2026-03-08T22:47:54.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:54.975 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:54.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj277 = obj250 ']' 2026-03-08T22:47:54.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj277 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.001 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.001 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj278 = obj250 ']' 2026-03-08T22:47:55.001 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj278 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:47:55.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj279 = obj250 ']' 2026-03-08T22:47:55.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj279 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.048 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.048 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.048 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj280 = obj250 ']' 2026-03-08T22:47:55.048 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj280 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.076 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj281 = obj250 ']' 
2026-03-08T22:47:55.076 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj281 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj282 = obj250 ']' 2026-03-08T22:47:55.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj282 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj283 = obj250 ']' 2026-03-08T22:47:55.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool 
pool-jerasure get obj283 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj284 = obj250 ']' 2026-03-08T22:47:55.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj284 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj285 = obj250 ']' 2026-03-08T22:47:55.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj285 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.207 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj286 = obj250 ']' 2026-03-08T22:47:55.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj286 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj287 = obj250 ']' 2026-03-08T22:47:55.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj287 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.255 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:47:55.256 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.256 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj288 = obj250 ']' 2026-03-08T22:47:55.256 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj288 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj289 = obj250 ']' 2026-03-08T22:47:55.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj289 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: 
TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj290 = obj250 ']' 2026-03-08T22:47:55.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj290 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj291 = obj250 ']' 2026-03-08T22:47:55.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj291 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.354 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.354 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj292 = obj250 ']' 2026-03-08T22:47:55.354 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj292 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj293 = obj250 ']' 2026-03-08T22:47:55.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj293 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj294 = obj250 ']' 2026-03-08T22:47:55.402 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj294 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj295 = obj250 ']' 2026-03-08T22:47:55.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj295 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj296 = obj250 ']' 2026-03-08T22:47:55.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj296 
td/test-erasure-eio/CHECK 2026-03-08T22:47:55.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj297 = obj250 ']' 2026-03-08T22:47:55.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj297 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj298 = obj250 ']' 2026-03-08T22:47:55.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj298 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: 
TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj299 = obj250 ']' 2026-03-08T22:47:55.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj299 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:591: TEST_ec_backfill_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:47:55.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:593: TEST_ec_backfill_unfound: '[' obj300 = obj250 ']' 2026-03-08T22:47:55.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:597: TEST_ec_backfill_unfound: rados --pool pool-jerasure get obj300 td/test-erasure-eio/CHECK 2026-03-08T22:47:55.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:598: TEST_ec_backfill_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.572 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:602: TEST_ec_backfill_unfound: rm -f td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:47:55.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:604: TEST_ec_backfill_unfound: delete_erasure_coded_pool pool-jerasure 2026-03-08T22:47:55.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:47:55.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it 2026-03-08T22:47:55.830 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T22:47:55.839 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile 2026-03-08T22:47:56.116 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T22:47:56.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T22:47:56.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:47:56.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:47:56.126 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:47:56.126 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:47:56.126 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:47:56.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:47:56.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:47:56.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:47:56.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:47:56.253 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:47:56.254 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:47:56.254 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:47:56.255 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:47:56.255 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:47:56.255 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:47:56.256 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:56.256 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:47:56.256 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:47:56.256 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:56.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:47:56.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:47:56.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:47:56.296 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:47:56.296 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:56.296 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:47:56.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:47:56.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:47:56.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:47:56.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T22:47:56.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T22:47:56.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T22:47:56.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio 2026-03-08T22:47:56.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:47:56.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:47:56.298 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:47:56.300 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:47:56.301 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:47:56.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:47:56.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:47:56.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:47:56.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:47:56.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:47:56.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:47:56.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:47:56.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:47:56.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:47:56.310 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:47:56.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:56.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:47:56.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:47:56.311 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:56.312 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:47:56.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:47:56.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:47:56.314 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:47:56.314 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:56.314 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:47:56.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:47:56.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:47:56.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:47:56.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T22:47:56.315 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:47:56.315 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:56.315 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:47:56.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080 2026-03-08T22:47:56.317 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:47:56.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:47:56.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:47:56.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T22:47:56.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T22:47:56.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T22:47:56.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:47:56.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:47:56.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:47:56.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T22:47:56.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T22:47:56.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:47:56.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:56.342 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:56.342 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:56.342 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:56.342 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:47:56.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:47:56.342 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:47:56.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:47:56.372 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 
2026-03-08T22:47:56.372 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:47:56.372 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:47:56.372 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:47:56.372 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:47:56.372 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:47:56.372 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:47:56.372 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:47:56.373 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:47:56.373 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:56.374 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:47:56.374 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:47:56.374 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:47:56.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T22:47:56.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:47:56.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:47:56.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:47:56.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:47:56.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:47:56.423 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:47:56.423 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:47:56.423 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:47:56.423 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:47:56.423 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:56.423 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:47:56.424 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:47:56.424 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:47:56.424 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T22:47:56.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T22:47:56.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T22:47:56.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:47:56.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:47:56.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:47:56.472 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T22:47:56.472 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:47:56.581 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:47:56.581 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:56.581 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:56.582 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:56.582 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:56.582 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:47:56.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:47:56.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:47:56.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 
--run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:47:56.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T22:47:56.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:47:56.714 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:47:56.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:47:57.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T22:47:57.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:47:57.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:47:57.725 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:47:57.725 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:57.725 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:47:57.725 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:47:57.725 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T22:47:57.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T22:47:57.768 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:47:57.772 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T22:47:57.773 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:47:56.363+0000 7f8ac9e2cd80 0 load: jerasure load: lrc 2026-03-08T22:47:57.774 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_ec_object_attr_read_error td/test-erasure-eio 2026-03-08T22:47:57.774 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:359: TEST_ec_object_attr_read_error: local dir=td/test-erasure-eio 2026-03-08T22:47:57.774 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:360: TEST_ec_object_attr_read_error: local objname=myobject 2026-03-08T22:47:57.774 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:362: TEST_ec_object_attr_read_error: setup_osds 7 2026-03-08T22:47:57.774 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=7 2026-03-08T22:47:57.774 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T22:47:57.774 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 7 - 1 2026-03-08T22:47:57.775 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 6 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: 
ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 
2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:47:57.776 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:47:57.777 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:47:57.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T22:47:57.778 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:47:57.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=82586c53-196b-4e4b-afbe-a6118e38d14d 2026-03-08T22:47:57.778 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 82586c53-196b-4e4b-afbe-a6118e38d14d 2026-03-08T22:47:57.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 82586c53-196b-4e4b-afbe-a6118e38d14d' 2026-03-08T22:47:57.778 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:47:57.791 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCd/K1pbgc/LxAAq7ElnYYmX7erEvjeEW6Uig== 2026-03-08T22:47:57.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCd/K1pbgc/LxAAq7ElnYYmX7erEvjeEW6Uig=="}' 2026-03-08T22:47:57.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 82586c53-196b-4e4b-afbe-a6118e38d14d -i td/test-erasure-eio/0/new.json 2026-03-08T22:47:57.915 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:47:57.926 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T22:47:57.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCd/K1pbgc/LxAAq7ElnYYmX7erEvjeEW6Uig== --osd-uuid 82586c53-196b-4e4b-afbe-a6118e38d14d 2026-03-08T22:47:57.946 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:57.946+0000 7f7af0d51780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:47:57.946 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:57.947+0000 7f7af0d51780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:57.950 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:57.950+0000 7f7af0d51780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:57.950 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:57.951+0000 7f7af0d51780 -1 bdev(0x5606f63f5c00 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:47:57.950 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:47:57.951+0000 7f7af0d51780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T22:48:00.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T22:48:00.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:48:00.104 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:48:00.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:48:00.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:48:00.372 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:48:00.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:48:00.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:48:00.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:48:00.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:48:00.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:48:00.390 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:00.391+0000 7f542b50c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:00.398 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:00.399+0000 7f542b50c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:00.399 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:00.400+0000 7f542b50c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:00.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:48:00.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:48:00.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:48:00.583 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:48:00.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:48:00.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:48:00.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:00.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:48:00.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:00.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:48:00.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:00.982 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:00.982+0000 7f542b50c780 -1 Falling back to public interface 2026-03-08T22:48:01.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:48:01.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:01.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:48:01.791 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:48:01.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:01.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:48:01.848 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:01.849+0000 7f542b50c780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:48:02.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:02.779 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:02.780+0000 7f5426cab640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:48:03.011 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:48:03.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:03.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:03.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:48:03.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:03.011 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:48:03.232 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/973280919,v1:127.0.0.1:6803/973280919] [v2:127.0.0.1:6804/973280919,v1:127.0.0.1:6805/973280919] exists,up 82586c53-196b-4e4b-afbe-a6118e38d14d 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:48:03.233 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:48:03.233 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:48:03.233 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:48:03.234 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:48:03.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:48:03.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:48:03.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local 
uuid=80176384-fe48-457a-ab43-3fec4725658a 2026-03-08T22:48:03.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 80176384-fe48-457a-ab43-3fec4725658a' 2026-03-08T22:48:03.236 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 80176384-fe48-457a-ab43-3fec4725658a 2026-03-08T22:48:03.236 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:48:03.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCj/K1ps7bmDhAAdUl2ZrmwYrGtjgLU0R4ADA== 2026-03-08T22:48:03.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCj/K1ps7bmDhAAdUl2ZrmwYrGtjgLU0R4ADA=="}' 2026-03-08T22:48:03.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 80176384-fe48-457a-ab43-3fec4725658a -i td/test-erasure-eio/1/new.json 2026-03-08T22:48:03.478 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:48:03.489 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T22:48:03.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 
--debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCj/K1ps7bmDhAAdUl2ZrmwYrGtjgLU0R4ADA== --osd-uuid 80176384-fe48-457a-ab43-3fec4725658a 2026-03-08T22:48:03.512 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:03.512+0000 7f058a22e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:03.513 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:03.514+0000 7f058a22e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:03.515 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:03.516+0000 7f058a22e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:03.515 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:03.516+0000 7f058a22e780 -1 bdev(0x56159c383c00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:48:03.515 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:03.516+0000 7f058a22e780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T22:48:05.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T22:48:05.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:48:05.647 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T22:48:05.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:48:05.647 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:48:05.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:48:05.940 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T22:48:05.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:48:05.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:48:05.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:48:05.943 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:48:05.959 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:05.959+0000 7f1759c0d780 -1 WARNING: all dangerous and experimental features are 
enabled. 2026-03-08T22:48:05.966 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:05.967+0000 7f1759c0d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:05.967 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:05.968+0000 7f1759c0d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:06.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:48:06.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:48:06.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:48:06.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:48:06.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:48:06.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:06.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:48:06.155 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:48:06.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:06.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:48:06.362 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:07.030 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:07.031+0000 7f1759c0d780 -1 Falling back to public interface 2026-03-08T22:48:07.363 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:07.363 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:07.363 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:48:07.363 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:48:07.364 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:07.364 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:48:07.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:08.112 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:08.113+0000 7f1759c0d780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:48:08.571 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:48:08.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:08.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:08.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2
2026-03-08T22:48:08.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:08.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:48:08.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:48:09.798 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:48:09.798 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:48:09.798 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:48:09.798 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:48:09.798 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:09.798 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:48:10.008 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 12 up_thru 12 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1425890121,v1:127.0.0.1:6811/1425890121] [v2:127.0.0.1:6812/1425890121,v1:127.0.0.1:6813/1425890121] exists,up 80176384-fe48-457a-ab43-3fec4725658a
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:48:10.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2'
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal'
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:48:10.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:48:10.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2
2026-03-08T22:48:10.012 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:48:10.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=8f1a5842-bde0-48e4-ae60-6f970e89fdbd
2026-03-08T22:48:10.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 8f1a5842-bde0-48e4-ae60-6f970e89fdbd'
2026-03-08T22:48:10.013 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 8f1a5842-bde0-48e4-ae60-6f970e89fdbd
2026-03-08T22:48:10.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:48:10.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCq/K1p8s2fARAA4BfW15qF09DPDrYtgtUUfA==
2026-03-08T22:48:10.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCq/K1p8s2fARAA4BfW15qF09DPDrYtgtUUfA=="}'
2026-03-08T22:48:10.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 8f1a5842-bde0-48e4-ae60-6f970e89fdbd -i td/test-erasure-eio/2/new.json
2026-03-08T22:48:10.245 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:48:10.254 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json
2026-03-08T22:48:10.255 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCq/K1p8s2fARAA4BfW15qF09DPDrYtgtUUfA== --osd-uuid 8f1a5842-bde0-48e4-ae60-6f970e89fdbd
2026-03-08T22:48:10.274 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:10.275+0000 7f8744a11780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:10.276 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:10.277+0000 7f8744a11780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:10.277 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:10.278+0000 7f8744a11780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:10.277 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:10.278+0000 7f8744a11780 -1 bdev(0x55566f8b1c00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted
2026-03-08T22:48:10.277 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:10.278+0000 7f8744a11780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid
2026-03-08T22:48:12.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring
2026-03-08T22:48:12.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:48:12.394 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository
2026-03-08T22:48:12.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T22:48:12.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:48:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T22:48:12.696 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2
2026-03-08T22:48:12.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:48:12.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:48:12.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:48:12.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:48:12.715 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:12.716+0000 7ffa9c413780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:12.717 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:12.718+0000 7ffa9c413780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:12.718 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:12.718+0000 7ffa9c413780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:12.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T22:48:12.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:48:12.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T22:48:12.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:48:12.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:48:12.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:48:12.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:48:12.918 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:48:12.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:12.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:48:13.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:48:13.293 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:13.294+0000 7ffa9c413780 -1 Falling back to public interface
2026-03-08T22:48:14.156 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:48:14.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:48:14.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:48:14.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:48:14.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:14.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:48:14.370 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:48:14.412 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:14.413+0000 7ffa9c413780 -1 osd.2 0 log_to_monitors true
2026-03-08T22:48:15.309 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:15.310+0000 7ffa97494640 -1 osd.2 0 waiting for initial osdmap
2026-03-08T22:48:15.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:48:15.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:48:15.372 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:48:15.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:48:15.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:15.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:48:15.592 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 17 up_thru 17 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/3805162160,v1:127.0.0.1:6819/3805162160] [v2:127.0.0.1:6820/3805162160,v1:127.0.0.1:6821/3805162160] exists,up 8f1a5842-bde0-48e4-ae60-6f970e89fdbd
2026-03-08T22:48:15.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:48:15.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:48:15.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3'
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal'
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:48:15.593 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:48:15.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3
2026-03-08T22:48:15.595 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:48:15.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=46136021-4dc4-41bb-8389-8cf5717e6892
2026-03-08T22:48:15.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 46136021-4dc4-41bb-8389-8cf5717e6892'
2026-03-08T22:48:15.596 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 46136021-4dc4-41bb-8389-8cf5717e6892
2026-03-08T22:48:15.596 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:48:15.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCv/K1p0q1sJBAA++UtIA3dsgoDeysMEHK4/w==
2026-03-08T22:48:15.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCv/K1p0q1sJBAA++UtIA3dsgoDeysMEHK4/w=="}'
2026-03-08T22:48:15.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 46136021-4dc4-41bb-8389-8cf5717e6892 -i td/test-erasure-eio/3/new.json
2026-03-08T22:48:15.822 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:48:15.833 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json
2026-03-08T22:48:15.833 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCv/K1p0q1sJBAA++UtIA3dsgoDeysMEHK4/w== --osd-uuid 46136021-4dc4-41bb-8389-8cf5717e6892
2026-03-08T22:48:15.852 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:15.853+0000 7f64f8ff9780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:15.854 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:15.855+0000 7f64f8ff9780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:15.855 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:15.856+0000 7f64f8ff9780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:15.855 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:15.856+0000 7f64f8ff9780 -1 bdev(0x5650dbbc7c00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted
2026-03-08T22:48:15.855 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:15.856+0000 7f64f8ff9780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid
2026-03-08T22:48:18.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring
2026-03-08T22:48:18.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:48:18.019 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository
2026-03-08T22:48:18.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository
2026-03-08T22:48:18.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:48:18.305 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3
2026-03-08T22:48:18.306 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3
2026-03-08T22:48:18.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:48:18.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:48:18.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:48:18.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:48:18.323 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:18.324+0000 7f6b7980f780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:18.330 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:18.331+0000 7f6b7980f780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:18.332 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:18.332+0000 7f6b7980f780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:18.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3
2026-03-08T22:48:18.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:48:18.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3
2026-03-08T22:48:18.511 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:48:18.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:48:18.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:48:18.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:48:18.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:48:18.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:18.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:48:18.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:48:18.885 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:18.886+0000 7f6b7980f780 -1 Falling back to public interface
2026-03-08T22:48:19.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:48:19.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:48:19.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:48:19.717 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:48:19.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:19.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:48:19.743 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:19.744+0000 7f6b7980f780 -1 osd.3 0 log_to_monitors true
2026-03-08T22:48:19.945 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:48:20.946 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:48:20.946 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:48:20.946 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:48:20.946 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:48:20.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:48:20.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:21.176 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 24 up_thru 25 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/2950326098,v1:127.0.0.1:6827/2950326098] [v2:127.0.0.1:6828/2950326098,v1:127.0.0.1:6829/2950326098] exists,up 46136021-4dc4-41bb-8389-8cf5717e6892
2026-03-08T22:48:21.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:48:21.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 4
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/4
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/4'
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/4/journal'
2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:48:21.177
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:48:21.177 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:48:21.178 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:48:21.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: 
mkdir -p td/test-erasure-eio/4 2026-03-08T22:48:21.179 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:48:21.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=ca8fe351-4f7f-4b2a-9c0d-c2598fe41b92 2026-03-08T22:48:21.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 ca8fe351-4f7f-4b2a-9c0d-c2598fe41b92' 2026-03-08T22:48:21.180 INFO:tasks.workunit.client.0.vm04.stdout:add osd4 ca8fe351-4f7f-4b2a-9c0d-c2598fe41b92 2026-03-08T22:48:21.180 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:48:21.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQC1/K1p7GacCxAADg7L8rD9yckXeBuxJhf41Q== 2026-03-08T22:48:21.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQC1/K1p7GacCxAADg7L8rD9yckXeBuxJhf41Q=="}' 2026-03-08T22:48:21.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new ca8fe351-4f7f-4b2a-9c0d-c2598fe41b92 -i td/test-erasure-eio/4/new.json 2026-03-08T22:48:21.412 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T22:48:21.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/4/new.json 2026-03-08T22:48:21.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none 
--mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQC1/K1p7GacCxAADg7L8rD9yckXeBuxJhf41Q== --osd-uuid ca8fe351-4f7f-4b2a-9c0d-c2598fe41b92 2026-03-08T22:48:21.440 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:21.441+0000 7f09e7c3f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:21.442 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:21.443+0000 7f09e7c3f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:21.443 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:21.444+0000 7f09e7c3f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:21.443 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:21.444+0000 7f09e7c3f780 -1 bdev(0x55c0b7955c00 td/test-erasure-eio/4/block) open stat got: (1) Operation not permitted 2026-03-08T22:48:21.443 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:21.444+0000 7f09e7c3f780 -1 bluestore(td/test-erasure-eio/4) _read_fsid unparsable uuid 2026-03-08T22:48:23.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/4/keyring 2026-03-08T22:48:23.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:48:23.563 INFO:tasks.workunit.client.0.vm04.stdout:adding osd4 key to auth repository 2026-03-08T22:48:23.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository 2026-03-08T22:48:23.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:48:23.848 INFO:tasks.workunit.client.0.vm04.stdout:start osd.4 2026-03-08T22:48:23.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4 2026-03-08T22:48:23.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:48:23.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:48:23.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:48:23.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:48:23.869 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:23.869+0000 7fa4a9e2c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:23.869 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:23.871+0000 7fa4a9e2c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:23.871 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:23.871+0000 7fa4a9e2c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:24.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4 2026-03-08T22:48:24.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:48:24.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4 2026-03-08T22:48:24.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:48:24.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:48:24.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:24.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:48:24.060 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:48:24.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:24.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:48:24.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:24.425 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:24.426+0000 7fa4a9e2c780 -1 Falling back to public interface 2026-03-08T22:48:25.271 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:48:25.271 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:25.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:25.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:48:25.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:25.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:48:25.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:25.783 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:25.784+0000 7fa4a9e2c780 -1 osd.4 0 log_to_monitors true 2026-03-08T22:48:26.490 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:48:26.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:26.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:26.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:48:26.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:26.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:48:26.716 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:27.433 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:27.433+0000 7fa4a4b7b640 -1 osd.4 0 waiting for initial osdmap 2026-03-08T22:48:27.719 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:48:27.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:27.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:27.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:48:27.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:48:27.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:27.933 INFO:tasks.workunit.client.0.vm04.stdout:osd.4 up in weight 1 up_from 33 up_thru 34 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/2485636608,v1:127.0.0.1:6835/2485636608] [v2:127.0.0.1:6836/2485636608,v1:127.0.0.1:6837/2485636608] exists,up ca8fe351-4f7f-4b2a-9c0d-c2598fe41b92 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:48:27.934 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 5 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=5 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/5 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:48:27.934 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/5' 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/5/journal' 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:27.934 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:48:27.934 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:48:27.935 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:48:27.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/5 2026-03-08T22:48:27.936 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:48:27.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=e772a619-73db-47f2-a26b-6088ca893bc5 2026-03-08T22:48:27.937 INFO:tasks.workunit.client.0.vm04.stdout:add osd5 e772a619-73db-47f2-a26b-6088ca893bc5 2026-03-08T22:48:27.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd5 e772a619-73db-47f2-a26b-6088ca893bc5' 2026-03-08T22:48:27.937 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:48:27.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQC7/K1pfTutOBAA0jxMjsZbFztRGrvorxt1wA== 2026-03-08T22:48:27.949 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQC7/K1pfTutOBAA0jxMjsZbFztRGrvorxt1wA=="}' 2026-03-08T22:48:27.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e772a619-73db-47f2-a26b-6088ca893bc5 -i td/test-erasure-eio/5/new.json 2026-03-08T22:48:28.162 INFO:tasks.workunit.client.0.vm04.stdout:5 2026-03-08T22:48:28.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/5/new.json 2026-03-08T22:48:28.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 5 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/5 --osd-journal=td/test-erasure-eio/5/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQC7/K1pfTutOBAA0jxMjsZbFztRGrvorxt1wA== --osd-uuid e772a619-73db-47f2-a26b-6088ca893bc5 2026-03-08T22:48:28.190 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:28.191+0000 7fd7ed960780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:28.192 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:28.193+0000 7fd7ed960780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:28.192 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:28.194+0000 7fd7ed960780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:28.193 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:28.194+0000 7fd7ed960780 -1 bdev(0x560c99229c00 td/test-erasure-eio/5/block) open stat got: (1) Operation not permitted 2026-03-08T22:48:28.193 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:28.194+0000 7fd7ed960780 -1 bluestore(td/test-erasure-eio/5) _read_fsid unparsable uuid 2026-03-08T22:48:30.812 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/5/keyring 2026-03-08T22:48:30.812 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:48:30.813 INFO:tasks.workunit.client.0.vm04.stdout:adding osd5 key to auth repository 2026-03-08T22:48:30.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd5 key to auth repository 2026-03-08T22:48:30.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/5/keyring auth add osd.5 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:48:31.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.5 2026-03-08T22:48:31.089 INFO:tasks.workunit.client.0.vm04.stdout:start osd.5 2026-03-08T22:48:31.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 5 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/5 --osd-journal=td/test-erasure-eio/5/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:48:31.090 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:48:31.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:48:31.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:48:31.106 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:31.107+0000 7f1e6c20e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:31.112 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:31.113+0000 7f1e6c20e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:31.114 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:31.114+0000 7f1e6c20e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:31.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 5 2026-03-08T22:48:31.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:48:31.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T22:48:31.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:48:31.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:48:31.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:31.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:48:31.292 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:48:31.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:31.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:48:31.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:32.200 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:32.201+0000 7f1e6c20e780 -1 Falling back to public interface 2026-03-08T22:48:32.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:48:32.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:32.495 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:48:32.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:48:32.496 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:32.496 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:48:32.710 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:33.065 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:33.066+0000 7f1e6c20e780 -1 osd.5 0 log_to_monitors true 2026-03-08T22:48:33.712 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:48:33.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:33.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:33.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:48:33.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:48:33.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:33.930 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:34.325 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:34.326+0000 7f1e672d5640 -1 osd.5 0 waiting for initial osdmap 2026-03-08T22:48:34.932 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:48:34.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:34.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:34.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:48:34.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:34.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:48:35.152 INFO:tasks.workunit.client.0.vm04.stdout:osd.5 up in weight 1 up_from 42 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6842/675627219,v1:127.0.0.1:6843/675627219] [v2:127.0.0.1:6844/675627219,v1:127.0.0.1:6845/675627219] exists,up e772a619-73db-47f2-a26b-6088ca893bc5 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:48:35.153 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 6 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=6 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/6 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:48:35.153 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/6' 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/6/journal' 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:48:35.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:35.154 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:48:35.154 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:48:35.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/6 2026-03-08T22:48:35.155 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:48:35.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=c6b0e8c5-f158-4251-b2cf-0de13eafa028 2026-03-08T22:48:35.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd6 c6b0e8c5-f158-4251-b2cf-0de13eafa028' 2026-03-08T22:48:35.156 INFO:tasks.workunit.client.0.vm04.stdout:add osd6 c6b0e8c5-f158-4251-b2cf-0de13eafa028 2026-03-08T22:48:35.156 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:48:35.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDD/K1pLvJUChAAEw95Bp+juPm5Z7Yw5UGrrQ== 2026-03-08T22:48:35.172 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDD/K1pLvJUChAAEw95Bp+juPm5Z7Yw5UGrrQ=="}' 2026-03-08T22:48:35.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new c6b0e8c5-f158-4251-b2cf-0de13eafa028 -i td/test-erasure-eio/6/new.json 2026-03-08T22:48:35.384 INFO:tasks.workunit.client.0.vm04.stdout:6 2026-03-08T22:48:35.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/6/new.json 2026-03-08T22:48:35.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 6 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/6 --osd-journal=td/test-erasure-eio/6/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDD/K1pLvJUChAAEw95Bp+juPm5Z7Yw5UGrrQ== --osd-uuid c6b0e8c5-f158-4251-b2cf-0de13eafa028 2026-03-08T22:48:35.412 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:35.413+0000 7f7e3b50c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:35.414 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:35.415+0000 7f7e3b50c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:35.415 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:35.416+0000 7f7e3b50c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:35.416 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:35.417+0000 7f7e3b50c780 -1 bdev(0x55c459bfdc00 td/test-erasure-eio/6/block) open stat got: (1) Operation not permitted 2026-03-08T22:48:35.416 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:35.417+0000 7f7e3b50c780 -1 bluestore(td/test-erasure-eio/6) _read_fsid unparsable uuid 2026-03-08T22:48:37.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/6/keyring 2026-03-08T22:48:37.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:48:37.543 INFO:tasks.workunit.client.0.vm04.stdout:adding osd6 key to auth repository 2026-03-08T22:48:37.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd6 key to auth repository 2026-03-08T22:48:37.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/6/keyring auth add osd.6 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:48:37.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.6 2026-03-08T22:48:37.819 INFO:tasks.workunit.client.0.vm04.stdout:start osd.6 2026-03-08T22:48:37.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 6 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/6 --osd-journal=td/test-erasure-eio/6/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:48:37.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:48:37.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:48:37.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:48:37.836 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:37.837+0000 7fbaa243f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:37.844 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:37.845+0000 7fbaa243f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:37.845 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:37.846+0000 7fbaa243f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:38.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 6 2026-03-08T22:48:38.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:48:38.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=6 2026-03-08T22:48:38.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:48:38.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:48:38.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:38.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:48:38.026 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:48:38.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:38.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:48:38.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:38.931 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:38.932+0000 7fbaa243f780 -1 Falling back to public interface 2026-03-08T22:48:39.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:48:39.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:39.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:48:39.233 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:48:39.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:39.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:48:39.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:39.792 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:39.793+0000 7fbaa243f780 -1 osd.6 0 log_to_monitors true 2026-03-08T22:48:40.443 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:48:40.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:40.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:40.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:48:40.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:40.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:48:40.668 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:41.028 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:41.029+0000 7fba9dbe0640 -1 osd.6 0 waiting for initial osdmap 2026-03-08T22:48:41.671 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:48:41.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:41.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:41.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:48:41.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:41.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:48:41.877 INFO:tasks.workunit.client.0.vm04.stdout:osd.6 up in weight 1 up_from 50 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6850/2029037375,v1:127.0.0.1:6851/2029037375] [v2:127.0.0.1:6852/2029037375,v1:127.0.0.1:6853/2029037375] exists,up c6b0e8c5-f158-4251-b2cf-0de13eafa028 2026-03-08T22:48:41.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:48:41.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:48:41.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:48:41.878 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 2026-03-08T22:48:41.878 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:48:41.878 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:48:41.878 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:48:41.878 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:41.878 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:48:41.879 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:48:41.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T22:48:41.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T22:48:41.928 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:48:41.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T22:48:41.935 
INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:48:00.985+0000 7f542b50c780 0 load: jerasure load: lrc 2026-03-08T22:48:41.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:364: TEST_ec_object_attr_read_error: local poolname=pool-jerasure 2026-03-08T22:48:41.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:365: TEST_ec_object_attr_read_error: create_erasure_coded_pool pool-jerasure 3 2 2026-03-08T22:48:41.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:48:41.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T22:48:41.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=3 2026-03-08T22:48:41.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T22:48:41.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=2 2026-03-08T22:48:41.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T22:48:41.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=3 m=2 crush-failure-domain=osd 2026-03-08T22:48:42.202 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T22:48:42.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T22:48:42.558 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T22:48:42.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:48:43.568 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T22:48:43.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:48:43.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:48:43.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:48:43.569 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:48:43.569 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:48:43.569 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:48:43.569 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:48:43.569 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:48:43.569 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:48:43.636 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:48:43.636 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:48:43.636 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:48:43.636 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:48:43.636 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:48:43.636 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:48:43.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:48:43.844 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:48:43.844 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:48:43.844 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:48:43.844 
INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:48:43.844 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:48:43.844 INFO:tasks.workunit.client.0.vm04.stderr:6' 2026-03-08T22:48:43.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:48:43.844 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:43.844 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:48:43.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803787 2026-03-08T22:48:43.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803787 2026-03-08T22:48:43.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787' 2026-03-08T22:48:43.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:43.912 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:48:43.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=51539607561 2026-03-08T22:48:43.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 51539607561 2026-03-08T22:48:43.979 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-51539607561' 2026-03-08T22:48:43.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:43.979 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:48:44.046 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=73014444040 2026-03-08T22:48:44.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 73014444040 2026-03-08T22:48:44.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-51539607561 2-73014444040' 2026-03-08T22:48:44.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:44.047 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:48:44.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215111 2026-03-08T22:48:44.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215111 2026-03-08T22:48:44.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-51539607561 2-73014444040 3-103079215111' 
2026-03-08T22:48:44.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:44.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:48:44.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920774 2026-03-08T22:48:44.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920774 2026-03-08T22:48:44.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-51539607561 2-73014444040 3-103079215111 4-141733920774' 2026-03-08T22:48:44.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:44.185 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:48:44.252 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=180388626436 2026-03-08T22:48:44.252 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 180388626436 2026-03-08T22:48:44.252 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-51539607561 2-73014444040 3-103079215111 4-141733920774 5-180388626436' 2026-03-08T22:48:44.252 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:44.253 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:48:44.320 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364803 2026-03-08T22:48:44.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364803 2026-03-08T22:48:44.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-51539607561 2-73014444040 3-103079215111 4-141733920774 5-180388626436 6-214748364803' 2026-03-08T22:48:44.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:44.321 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803787 2026-03-08T22:48:44.321 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:44.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:48:44.322 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803787 2026-03-08T22:48:44.322 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:44.323 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803787 2026-03-08T22:48:44.323 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803787' 2026-03-08T22:48:44.323 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803787 2026-03-08T22:48:44.323 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:44.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803784 -lt 25769803787 2026-03-08T22:48:44.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:48:45.531 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:48:45.532 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:45.740 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803787 -lt 25769803787 2026-03-08T22:48:45.740 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:45.740 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-51539607561 2026-03-08T22:48:45.740 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T22:48:45.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:48:45.741 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-51539607561 2026-03-08T22:48:45.742 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:45.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=51539607561 2026-03-08T22:48:45.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 51539607561' 2026-03-08T22:48:45.742 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 51539607561 2026-03-08T22:48:45.743 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:48:45.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 51539607562 -lt 51539607561 2026-03-08T22:48:45.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:45.960 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-73014444040 2026-03-08T22:48:45.960 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:45.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T22:48:45.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-73014444040 2026-03-08T22:48:45.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:45.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=73014444040 2026-03-08T22:48:45.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 73014444040' 2026-03-08T22:48:45.962 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 73014444040 2026-03-08T22:48:45.962 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:48:46.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 73014444040 -lt 73014444040 2026-03-08T22:48:46.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:46.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-103079215111 2026-03-08T22:48:46.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:46.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:48:46.178 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: echo 3-103079215111 2026-03-08T22:48:46.178 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:46.179 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215111 2026-03-08T22:48:46.179 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 103079215111' 2026-03-08T22:48:46.179 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 103079215111 2026-03-08T22:48:46.179 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:48:46.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215111 -lt 103079215111 2026-03-08T22:48:46.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:46.393 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-141733920774 2026-03-08T22:48:46.393 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:46.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:48:46.394 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-141733920774 2026-03-08T22:48:46.394 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:46.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920774 2026-03-08T22:48:46.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 141733920774' 2026-03-08T22:48:46.395 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 141733920774 2026-03-08T22:48:46.395 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:48:46.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920774 -lt 141733920774 2026-03-08T22:48:46.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:46.603 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-180388626436 2026-03-08T22:48:46.603 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:46.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:48:46.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-180388626436 2026-03-08T22:48:46.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T22:48:46.605 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=180388626436 2026-03-08T22:48:46.605 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 180388626436' 2026-03-08T22:48:46.606 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 180388626436 2026-03-08T22:48:46.606 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:48:46.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 180388626437 -lt 180388626436 2026-03-08T22:48:46.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:46.826 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-214748364803 2026-03-08T22:48:46.827 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:46.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:48:46.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-214748364803 2026-03-08T22:48:46.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:46.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=214748364803 2026-03-08T22:48:46.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 214748364803' 2026-03-08T22:48:46.829 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 214748364803 2026-03-08T22:48:46.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:48:47.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364803 -lt 214748364803 2026-03-08T22:48:47.034 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:48:47.034 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:47.034 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:47.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:48:47.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:48:47.325 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:48:47.325 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:48:47.325 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:48:47.325 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:48:47.326 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:48:47.326 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:48:47.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:48:47.534 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:48:47.534 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:47.534 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:47.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:48:47.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:48:47.818 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:48:47.818 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:367: TEST_ec_object_attr_read_error: get_primary pool-jerasure myobject 2026-03-08T22:48:47.818 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=pool-jerasure 2026-03-08T22:48:47.818 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=myobject 2026-03-08T22:48:47.818 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map pool-jerasure myobject 2026-03-08T22:48:47.818 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:48:48.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:367: TEST_ec_object_attr_read_error: local primary_osd=3 2026-03-08T22:48:48.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:369: TEST_ec_object_attr_read_error: kill_daemons td/test-erasure-eio TERM osd.3 2026-03-08T22:48:48.035 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:48:48.035 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:48:48.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: 
kill_daemons: local trace=true 2026-03-08T22:48:48.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:48:48.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:48:48.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:372: TEST_ec_object_attr_read_error: rados_put td/test-erasure-eio pool-jerasure myobject 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=myobject 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:48:48.347 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:48:48.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put myobject td/test-erasure-eio/ORIGINAL 2026-03-08T22:48:48.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:375: TEST_ec_object_attr_read_error: inject_eio ec mdata pool-jerasure myobject td/test-erasure-eio 1 2026-03-08T22:48:48.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec 2026-03-08T22:48:48.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T22:48:48.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=mdata 2026-03-08T22:48:48.679 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T22:48:48.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure 2026-03-08T22:48:48.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T22:48:48.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=myobject 2026-03-08T22:48:48.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T22:48:48.680 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio 2026-03-08T22:48:48.680 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T22:48:48.680 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=1 2026-03-08T22:48:48.680 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T22:48:48.680 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure myobject 2026-03-08T22:48:48.680 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:48:48.680 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=myobject 
2026-03-08T22:48:48.681 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure myobject 2026-03-08T22:48:48.681 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:48:48.901 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=2147483647 2026-03-08T22:48:48.901 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:48:48.901 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:48:48.901 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:48:48.901 INFO:tasks.workunit.client.0.vm04.stderr:6' 2026-03-08T22:48:48.901 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 2147483647 5 1 0 6 2026-03-08T22:48:48.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('2147483647' '5' '1' '0' '6') 2026-03-08T22:48:48.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T22:48:48.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=5 2026-03-08T22:48:48.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']' 2026-03-08T22:48:48.901 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/5/type 2026-03-08T22:48:48.902 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T22:48:48.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 5 bluestore_debug_inject_read_err true 2026-03-08T22:48:48.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:48:48.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=5 2026-03-08T22:48:48.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T22:48:48.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:48:48.903 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:48:48.903 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.5 2026-03-08T22:48:48.903 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.5 2026-03-08T22:48:48.903 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.5 ']' 2026-03-08T22:48:48.903 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:48:48.903 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:48.903 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:48:48.903 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.5.asok 2026-03-08T22:48:48.904 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.5.asok config set bluestore_debug_inject_read_err true 2026-03-08T22:48:48.957 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T22:48:48.957 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T22:48:48.957 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T22:48:48.957 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.5 2026-03-08T22:48:48.957 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.5 2026-03-08T22:48:48.957 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.5 ']' 2026-03-08T22:48:48.957 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:48:48.957 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:48.957 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:48:48.958 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.5.asok 2026-03-08T22:48:48.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T22:48:48.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.5.asok injectmdataerr pool-jerasure myobject 1 2026-03-08T22:48:49.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:378: TEST_ec_object_attr_read_error: activate_osd td/test-erasure-eio 3 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/3 
2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 
2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: 
activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:48:49.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:48:49.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:48:49.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:48:49.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:48:49.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:48:49.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:48:49.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:48:49.014 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3 2026-03-08T22:48:49.015 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:48:49.015 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/3/whoami 2026-03-08T22:48:49.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:48:49.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:48:49.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:48:49.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:48:49.034 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:49.033+0000 7efcfac11780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:49.034 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:49.035+0000 7efcfac11780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:49.036 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:49.036+0000 7efcfac11780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:49.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:48:49.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:48:49.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:48:49.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:48:49.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:48:49.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:49.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:48:49.230 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:48:49.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:49.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:48:49.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:49.848 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:49.848+0000 7efcfac11780 -1 Falling back to public interface 2026-03-08T22:48:50.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:48:50.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:50.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:48:50.445 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:48:50.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:50.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:48:50.650 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:50.718 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:50.719+0000 7efcfac11780 -1 osd.3 58 log_to_monitors true 2026-03-08T22:48:51.652 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:48:51.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:51.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:51.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:48:51.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:51.653 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:48:51.872 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 
up_from 62 up_thru 62 down_at 59 last_clean_interval [24,58) [v2:127.0.0.1:6826/454625140,v1:127.0.0.1:6827/454625140] [v2:127.0.0.1:6828/454625140,v1:127.0.0.1:6829/454625140] exists,up 46136021-4dc4-41bb-8389-8cf5717e6892 2026-03-08T22:48:51.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:48:51.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:48:51.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:48:51.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:381: TEST_ec_object_attr_read_error: wait_for_clean 2026-03-08T22:48:51.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:48:51.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:48:51.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:48:51.872 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:48:51.873 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:48:51.873 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:48:51.873 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:48:51.873 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:48:51.873 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:48:51.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:48:51.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:48:51.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:48:51.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:48:51.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:48:51.948 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:48:52.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:48:52.161 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:48:52.161 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:48:52.161 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:48:52.161 
INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:48:52.161 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:48:52.161 INFO:tasks.workunit.client.0.vm04.stderr:6' 2026-03-08T22:48:52.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:48:52.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:52.162 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:48:52.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803790 2026-03-08T22:48:52.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803790 2026-03-08T22:48:52.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790' 2026-03-08T22:48:52.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:52.247 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:48:52.318 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=51539607565 2026-03-08T22:48:52.318 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 51539607565 2026-03-08T22:48:52.318 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-51539607565' 2026-03-08T22:48:52.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:52.319 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:48:52.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=73014444044 2026-03-08T22:48:52.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 73014444044 2026-03-08T22:48:52.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-51539607565 2-73014444044' 2026-03-08T22:48:52.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:52.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:48:52.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=266287972355 2026-03-08T22:48:52.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 266287972355 2026-03-08T22:48:52.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-51539607565 2-73014444044 3-266287972355' 
2026-03-08T22:48:52.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:52.461 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:48:52.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920777 2026-03-08T22:48:52.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920777 2026-03-08T22:48:52.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-51539607565 2-73014444044 3-266287972355 4-141733920777' 2026-03-08T22:48:52.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:52.529 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:48:52.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=180388626440 2026-03-08T22:48:52.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 180388626440 2026-03-08T22:48:52.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-51539607565 2-73014444044 3-266287972355 4-141733920777 5-180388626440' 2026-03-08T22:48:52.594 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:52.595 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:48:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364807 2026-03-08T22:48:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364807 2026-03-08T22:48:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-51539607565 2-73014444044 3-266287972355 4-141733920777 5-180388626440 6-214748364807' 2026-03-08T22:48:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:52.664 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803790 2026-03-08T22:48:52.664 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:52.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:48:52.666 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803790 2026-03-08T22:48:52.666 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:52.667 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803790 2026-03-08T22:48:52.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803790' 2026-03-08T22:48:52.667 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803790 2026-03-08T22:48:52.667 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:52.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803790 -lt 25769803790 2026-03-08T22:48:52.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:52.903 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-51539607565 2026-03-08T22:48:52.903 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:52.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:48:52.905 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-51539607565 2026-03-08T22:48:52.905 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:52.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=51539607565 
2026-03-08T22:48:52.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 51539607565'
2026-03-08T22:48:52.906 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 51539607565
2026-03-08T22:48:52.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:48:53.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 51539607565 -lt 51539607565
2026-03-08T22:48:53.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:48:53.126 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-73014444044
2026-03-08T22:48:53.127 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:48:53.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:48:53.128 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-73014444044
2026-03-08T22:48:53.128 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:48:53.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=73014444044
2026-03-08T22:48:53.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 73014444044'
2026-03-08T22:48:53.129 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 73014444044
2026-03-08T22:48:53.129 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:48:53.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 73014444044 -lt 73014444044
2026-03-08T22:48:53.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:48:53.345 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-266287972355
2026-03-08T22:48:53.345 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:48:53.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:48:53.346 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-266287972355
2026-03-08T22:48:53.347 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:48:53.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=266287972355
2026-03-08T22:48:53.347 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 266287972355'
2026-03-08T22:48:53.347 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 266287972355
2026-03-08T22:48:53.348 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:48:53.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 266287972355 -lt 266287972355
2026-03-08T22:48:53.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:48:53.552 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-141733920777
2026-03-08T22:48:53.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:48:53.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T22:48:53.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-141733920777
2026-03-08T22:48:53.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:48:53.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920777
2026-03-08T22:48:53.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 141733920777'
2026-03-08T22:48:53.555 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 141733920777
2026-03-08T22:48:53.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T22:48:53.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920777 -lt 141733920777
2026-03-08T22:48:53.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:48:53.762 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-180388626440
2026-03-08T22:48:53.762 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:48:53.763 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5
2026-03-08T22:48:53.764 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-180388626440
2026-03-08T22:48:53.764 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:48:53.764 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=180388626440
2026-03-08T22:48:53.764 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 180388626440'
2026-03-08T22:48:53.764 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 180388626440
2026-03-08T22:48:53.765 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5
2026-03-08T22:48:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 180388626440 -lt 180388626440
2026-03-08T22:48:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:48:53.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-214748364807
2026-03-08T22:48:53.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:48:53.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6
2026-03-08T22:48:53.979 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-214748364807
2026-03-08T22:48:53.979 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:48:53.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364807
2026-03-08T22:48:53.980 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 214748364807'
2026-03-08T22:48:53.980 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 214748364807
2026-03-08T22:48:53.980 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6
2026-03-08T22:48:54.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364807 -lt 214748364807
2026-03-08T22:48:54.187 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:48:54.187 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:48:54.187 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:48:54.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:48:54.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:48:54.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:48:54.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:48:54.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:48:54.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:48:54.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:48:54.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:48:54.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:48:54.672 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:48:54.672 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:48:54.672 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:48:54.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:48:54.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:48:54.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:48:54.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:383: TEST_ec_object_attr_read_error: rados_get td/test-erasure-eio pool-jerasure myobject
2026-03-08T22:48:54.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio
2026-03-08T22:48:54.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure
2026-03-08T22:48:54.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=myobject
2026-03-08T22:48:54.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok
2026-03-08T22:48:54.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']'
2026-03-08T22:48:54.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get myobject td/test-erasure-eio/COPY
2026-03-08T22:48:54.966 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY
2026-03-08T22:48:54.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY
2026-03-08T22:48:54.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:385: TEST_ec_object_attr_read_error: delete_erasure_coded_pool pool-jerasure
2026-03-08T22:48:54.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure
2026-03-08T22:48:54.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it
2026-03-08T22:48:55.269 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist
2026-03-08T22:48:55.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile
2026-03-08T22:48:55.548 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist
2026-03-08T22:48:55.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio
2026-03-08T22:48:55.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio
2026-03-08T22:48:55.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:48:55.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL
2026-03-08T22:48:55.558 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:48:55.558 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:48:55.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:48:55.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:48:55.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:48:55.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:48:55.689 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:48:55.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:48:55.690 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:48:55.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:48:55.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:48:55.692 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:48:55.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:48:55.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:48:55.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:48:55.693 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:48:55.693 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:48:55.694 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:48:55.694 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio
2026-03-08T22:48:55.720 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:48:55.720 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:48:55.720 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:48:55.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080
2026-03-08T22:48:55.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:48:55.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:48:55.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs
2026-03-08T22:48:55.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio
2026-03-08T22:48:55.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio
2026-03-08T22:48:55.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio
2026-03-08T22:48:55.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio
2026-03-08T22:48:55.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:48:55.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL
2026-03-08T22:48:55.725 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:48:55.725 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:48:55.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:48:55.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:48:55.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:48:55.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:48:55.726 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:48:55.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:48:55.727 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:48:55.728 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:48:55.728 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:48:55.728 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:48:55.728 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:48:55.728 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:48:55.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:48:55.729 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:48:55.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:48:55.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:48:55.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio
2026-03-08T22:48:55.731 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:48:55.731 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:48:55.731 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:48:55.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080
2026-03-08T22:48:55.732 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:48:55.732 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:48:55.732 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio
2026-03-08T22:48:55.733 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:48:55.733 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:48:55.733 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:48:55.733 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080
2026-03-08T22:48:55.734 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:48:55.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:48:55.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:48:55.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT
2026-03-08T22:48:55.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a
2026-03-08T22:48:55.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio
2026-03-08T22:48:55.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:48:55.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:48:55.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:48:55.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a
2026-03-08T22:48:55.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio
2026-03-08T22:48:55.774 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:48:55.774 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:48:55.774 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:48:55.774 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:48:55.774 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:48:55.774 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:48:55.774 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:48:55.774 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:48:55.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:48:55.806 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:48:55.806 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:48:55.806 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:48:55.806 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:48:55.806 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:48:55.807 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:48:55.807 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:48:55.807 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:48:55.809 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:48:55.809 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:48:55.809 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:48:55.809 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok
2026-03-08T22:48:55.810 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:48:55.810 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid
2026-03-08T22:48:55.861 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:48:55.861 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:48:55.861 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:48:55.861 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:48:55.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:48:55.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:48:55.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:48:55.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:48:55.862 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:48:55.862 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:48:55.862 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:48:55.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok
2026-03-08T22:48:55.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:48:55.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host
2026-03-08T22:48:55.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x
2026-03-08T22:48:55.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio
2026-03-08T22:48:55.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:48:55.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:48:55.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:48:55.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x
2026-03-08T22:48:55.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:48:56.023 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:48:56.023 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:48:56.023 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:48:56.023 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:48:56.023 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:48:56.023 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:48:56.023 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:48:56.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:48:56.025 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms
20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:48:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T22:48:56.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:48:56.161 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:48:56.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:48:57.174 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T22:48:57.174 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:48:57.174 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:48:57.174 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:48:57.174 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:57.174 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:48:57.174 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:48:57.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T22:48:57.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T22:48:57.220 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:48:57.224 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T22:48:57.225 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:48:55.797+0000 7fe0227cdd80 0 load: jerasure load: lrc 2026-03-08T22:48:57.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_ec_recovery_multiple_errors td/test-erasure-eio 2026-03-08T22:48:57.226 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:418: TEST_ec_recovery_multiple_errors: local dir=td/test-erasure-eio 2026-03-08T22:48:57.226 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:419: TEST_ec_recovery_multiple_errors: local objname=myobject 2026-03-08T22:48:57.226 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:421: TEST_ec_recovery_multiple_errors: setup_osds 9 2026-03-08T22:48:57.226 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local 
count=9 2026-03-08T22:48:57.226 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T22:48:57.226 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 9 - 1 2026-03-08T22:48:57.227 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 8 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none 
--mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:48:57.228 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:48:57.228 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' 
--pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:48:57.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T22:48:57.230 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:48:57.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=361d1f82-3a12-44ec-a735-bbbddcf082f0 2026-03-08T22:48:57.231 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 361d1f82-3a12-44ec-a735-bbbddcf082f0 2026-03-08T22:48:57.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 
361d1f82-3a12-44ec-a735-bbbddcf082f0' 2026-03-08T22:48:57.231 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:48:57.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDZ/K1p79GXDhAApr+m4l9+en4wFwDgAiptcw== 2026-03-08T22:48:57.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDZ/K1p79GXDhAApr+m4l9+en4wFwDgAiptcw=="}' 2026-03-08T22:48:57.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 361d1f82-3a12-44ec-a735-bbbddcf082f0 -i td/test-erasure-eio/0/new.json 2026-03-08T22:48:57.367 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:48:57.387 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T22:48:57.387 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key 
AQDZ/K1p79GXDhAApr+m4l9+en4wFwDgAiptcw== --osd-uuid 361d1f82-3a12-44ec-a735-bbbddcf082f0 2026-03-08T22:48:57.410 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:57.409+0000 7f1c8ffa7780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:57.410 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:57.411+0000 7f1c8ffa7780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:57.411 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:57.412+0000 7f1c8ffa7780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:57.412 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:57.412+0000 7f1c8ffa7780 -1 bdev(0x5611d19bc800 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:48:57.412 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:57.412+0000 7f1c8ffa7780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T22:48:59.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T22:48:59.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:48:59.554 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:48:59.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:48:59.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:48:59.846 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:48:59.846 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:48:59.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:48:59.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:48:59.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:48:59.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:48:59.864 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:59.865+0000 7fd8eb1d2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:59.867 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:59.868+0000 7fd8eb1d2780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:59.869 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:48:59.869+0000 7fd8eb1d2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:00.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:49:00.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:00.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:49:00.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:00.062 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:49:00.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:00.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:00.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:00.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:00.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:00.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:01.195 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:01.196+0000 7fd8eb1d2780 -1 Falling back to public 
interface 2026-03-08T22:49:01.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:01.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:01.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:01.271 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:49:01.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:01.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:01.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:02.053 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:02.054+0000 7fd8eb1d2780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:49:02.471 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:49:02.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:02.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:02.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:02.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:02.471 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:02.702 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:03.706 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:49:03.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:03.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:03.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:49:03.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:03.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/730641926,v1:127.0.0.1:6803/730641926] [v2:127.0.0.1:6804/730641926,v1:127.0.0.1:6805/730641926] exists,up 361d1f82-3a12-44ec-a735-bbbddcf082f0 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:03.915 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:03.915 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:49:03.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:03.916 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:03.916 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:03.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:49:03.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:49:03.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:49:03.917 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:49:03.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=43e54e74-7ae8-4725-8ed0-723ba1de5f6c 2026-03-08T22:49:03.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 43e54e74-7ae8-4725-8ed0-723ba1de5f6c' 2026-03-08T22:49:03.918 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 43e54e74-7ae8-4725-8ed0-723ba1de5f6c 2026-03-08T22:49:03.918 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:49:03.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDf/K1pqTuPNxAAbblmJRryn0bDBPGFxZEeXA== 2026-03-08T22:49:03.930 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDf/K1pqTuPNxAAbblmJRryn0bDBPGFxZEeXA=="}' 2026-03-08T22:49:03.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 43e54e74-7ae8-4725-8ed0-723ba1de5f6c -i td/test-erasure-eio/1/new.json 2026-03-08T22:49:04.133 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:49:04.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T22:49:04.143 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDf/K1pqTuPNxAAbblmJRryn0bDBPGFxZEeXA== --osd-uuid 43e54e74-7ae8-4725-8ed0-723ba1de5f6c 2026-03-08T22:49:04.161 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:04.162+0000 7f24c2013780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:04.163 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:04.164+0000 7f24c2013780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:04.164 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:04.165+0000 7f24c2013780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:04.165 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:04.166+0000 7f24c2013780 -1 bdev(0x55deac887c00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:49:04.165 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:04.166+0000 7f24c2013780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T22:49:06.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T22:49:06.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:49:06.307 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T22:49:06.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:49:06.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:49:06.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:49:06.589 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T22:49:06.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:49:06.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:49:06.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:49:06.591 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:49:06.606 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:06.607+0000 7f04ef753780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:06.608 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:06.609+0000 7f04ef753780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:06.610 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:06.610+0000 7f04ef753780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:06.799 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:49:06.799 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:06.799 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:49:06.799 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:49:06.799 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:06.799 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:06.799 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:06.799 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:06.799 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:06.799 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:49:07.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:07.164 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:07.165+0000 7f04ef753780 -1 Falling back to public interface 2026-03-08T22:49:08.017 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:49:08.017 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:08.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:08.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:08.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:08.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:49:08.036 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:08.037+0000 7f04ef753780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:49:08.260 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:09.123 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:09.124+0000 7f04eaef2640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T22:49:09.263 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:49:09.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:09.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:09.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:09.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:09.263 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:49:09.491 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3316413078,v1:127.0.0.1:6811/3316413078] [v2:127.0.0.1:6812/3316413078,v1:127.0.0.1:6813/3316413078] exists,up 43e54e74-7ae8-4725-8ed0-723ba1de5f6c 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:49:09.492 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:49:09.492 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:49:09.492 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:49:09.493 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:49:09.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:49:09.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:49:09.495 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:49:09.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local 
uuid=0d517678-1f52-4ee9-802e-c28d4022cf74 2026-03-08T22:49:09.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 0d517678-1f52-4ee9-802e-c28d4022cf74' 2026-03-08T22:49:09.495 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 0d517678-1f52-4ee9-802e-c28d4022cf74 2026-03-08T22:49:09.496 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:49:09.509 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDl/K1pnMVpHhAAYKmSuam5+kTzQpabuS7TgQ== 2026-03-08T22:49:09.509 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDl/K1pnMVpHhAAYKmSuam5+kTzQpabuS7TgQ=="}' 2026-03-08T22:49:09.509 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0d517678-1f52-4ee9-802e-c28d4022cf74 -i td/test-erasure-eio/2/new.json 2026-03-08T22:49:09.738 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:49:09.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T22:49:09.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 
--debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDl/K1pnMVpHhAAYKmSuam5+kTzQpabuS7TgQ== --osd-uuid 0d517678-1f52-4ee9-802e-c28d4022cf74 2026-03-08T22:49:09.767 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:09.768+0000 7fc821c0f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:09.769 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:09.770+0000 7fc821c0f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:09.770 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:09.771+0000 7fc821c0f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:09.770 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:09.771+0000 7fc821c0f780 -1 bdev(0x55b1f06a3c00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:49:09.770 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:09.771+0000 7fc821c0f780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T22:49:11.899 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T22:49:11.899 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:49:11.900 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:49:11.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:49:11.900 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:49:12.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:49:12.188 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:49:12.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:49:12.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:49:12.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:49:12.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:49:12.211 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:12.211+0000 7faf4440f780 -1 WARNING: all dangerous and experimental features are 
enabled. 2026-03-08T22:49:12.212 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:12.214+0000 7faf4440f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:12.214 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:12.215+0000 7faf4440f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:12.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:49:12.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:12.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:49:12.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:12.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:12.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:12.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:12.413 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:49:12.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:12.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:49:12.630 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:13.016 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:13.017+0000 7faf4440f780 -1 Falling back to public interface 2026-03-08T22:49:13.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:13.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:13.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:13.631 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:49:13.632 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:13.632 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:49:13.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:13.878 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:13.879+0000 7faf4440f780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:49:14.853 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:49:14.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:14.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:14.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:49:14.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:14.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:49:15.055 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:15.056+0000 7faf3fbae640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T22:49:15.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:16.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:16.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:16.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:49:16.088 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:49:16.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:16.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:49:16.298 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 19 up_thru 19 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/662014494,v1:127.0.0.1:6819/662014494] [v2:127.0.0.1:6820/662014494,v1:127.0.0.1:6821/662014494] exists,up 0d517678-1f52-4ee9-802e-c28d4022cf74 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: 
status=0 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: 
ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:49:16.299 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 
2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:16.300 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:49:16.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:49:16.301 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:49:16.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=49519a58-e62b-48db-bd83-79d084755f2b 2026-03-08T22:49:16.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 49519a58-e62b-48db-bd83-79d084755f2b' 2026-03-08T22:49:16.302 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 49519a58-e62b-48db-bd83-79d084755f2b 2026-03-08T22:49:16.302 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:49:16.315 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDs/K1pbeHhEhAAXTv1I08lUPq4PMGa9tg7Yw== 2026-03-08T22:49:16.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDs/K1pbeHhEhAAXTv1I08lUPq4PMGa9tg7Yw=="}' 2026-03-08T22:49:16.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 49519a58-e62b-48db-bd83-79d084755f2b -i td/test-erasure-eio/3/new.json 2026-03-08T22:49:16.530 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:49:16.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json 2026-03-08T22:49:16.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDs/K1pbeHhEhAAXTv1I08lUPq4PMGa9tg7Yw== --osd-uuid 49519a58-e62b-48db-bd83-79d084755f2b 2026-03-08T22:49:16.559 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:16.560+0000 7f993ed4d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:16.561 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:16.562+0000 7f993ed4d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:16.561 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:16.563+0000 7f993ed4d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:16.562 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:16.563+0000 7f993ed4d780 -1 bdev(0x557a11fffc00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:49:16.562 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:16.563+0000 7f993ed4d780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid 2026-03-08T22:49:18.700 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring 2026-03-08T22:49:18.700 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:49:18.701 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository 2026-03-08T22:49:18.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:49:18.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:49:18.985 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:49:18.985 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3 2026-03-08T22:49:18.985 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:49:18.985 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:49:18.986 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:49:18.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:49:19.005 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:19.005+0000 7f22dbd2c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:19.012 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:19.013+0000 7f22dbd2c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:19.014 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:19.014+0000 7f22dbd2c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:19.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:49:19.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:19.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:49:19.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:19.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:19.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:19.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:19.198 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:49:19.199 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:19.199 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:49:19.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:20.370 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:20.371+0000 7f22dbd2c780 -1 Falling back to public interface 2026-03-08T22:49:20.414 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:49:20.414 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:20.414 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:20.414 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:49:20.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:20.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:49:20.616 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:21.245 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:21.246+0000 7f22dbd2c780 -1 osd.3 0 log_to_monitors true 2026-03-08T22:49:21.618 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:49:21.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:21.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:21.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:21.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:21.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:49:21.849 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:22.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:22.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:22.850 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:49:22.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:49:22.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:49:22.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:23.068 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 28 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/2201154199,v1:127.0.0.1:6827/2201154199] [v2:127.0.0.1:6828/2201154199,v1:127.0.0.1:6829/2201154199] exists,up 49519a58-e62b-48db-bd83-79d084755f2b 2026-03-08T22:49:23.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:23.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:23.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr 
$count - 1)) 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 4 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/4 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:49:23.069 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/4' 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/4/journal' 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:49:23.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:49:23.070 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:49:23.070 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:23.070 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:49:23.070 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:23.070 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:23.070 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:49:23.070 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:23.071 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:49:23.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/4 2026-03-08T22:49:23.072 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:49:23.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1db77cd4-11ef-406a-80bb-affaa7b341e6 2026-03-08T22:49:23.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 1db77cd4-11ef-406a-80bb-affaa7b341e6' 2026-03-08T22:49:23.073 INFO:tasks.workunit.client.0.vm04.stdout:add osd4 1db77cd4-11ef-406a-80bb-affaa7b341e6 2026-03-08T22:49:23.074 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:49:23.086 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDz/K1pxRM7BRAAfgLUnbMYD3tiH4k4cEhouw== 2026-03-08T22:49:23.086 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDz/K1pxRM7BRAAfgLUnbMYD3tiH4k4cEhouw=="}' 2026-03-08T22:49:23.086 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1db77cd4-11ef-406a-80bb-affaa7b341e6 -i td/test-erasure-eio/4/new.json 2026-03-08T22:49:23.307 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T22:49:23.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/4/new.json 2026-03-08T22:49:23.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDz/K1pxRM7BRAAfgLUnbMYD3tiH4k4cEhouw== --osd-uuid 1db77cd4-11ef-406a-80bb-affaa7b341e6 2026-03-08T22:49:23.334 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:23.335+0000 7f4cecec2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:23.336 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:23.337+0000 7f4cecec2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:23.337 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:23.338+0000 7f4cecec2780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:23.337 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:23.338+0000 7f4cecec2780 -1 bdev(0x5611c3fbdc00 td/test-erasure-eio/4/block) open stat got: (1) Operation not permitted 2026-03-08T22:49:23.337 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:23.338+0000 7f4cecec2780 -1 bluestore(td/test-erasure-eio/4) _read_fsid unparsable uuid 2026-03-08T22:49:25.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/4/keyring 2026-03-08T22:49:25.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:49:25.968 INFO:tasks.workunit.client.0.vm04.stdout:adding osd4 key to auth repository 2026-03-08T22:49:25.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository 2026-03-08T22:49:25.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:49:26.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4 2026-03-08T22:49:26.240 INFO:tasks.workunit.client.0.vm04.stdout:start osd.4 2026-03-08T22:49:26.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:49:26.240 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:49:26.241 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:49:26.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:49:26.258 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:26.258+0000 7f966761f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:26.265 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:26.267+0000 7f966761f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:26.267 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:26.267+0000 7f966761f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:26.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4 2026-03-08T22:49:26.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:26.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4 2026-03-08T22:49:26.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:26.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:26.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:26.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:26.442 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:49:26.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:26.442 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:49:26.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:26.839 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:26.840+0000 7f966761f780 -1 Falling back to public interface 2026-03-08T22:49:27.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:49:27.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:27.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:27.644 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:49:27.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:27.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:49:27.703 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:27.704+0000 7f966761f780 -1 osd.4 0 log_to_monitors true 2026-03-08T22:49:27.865 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:28.869 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:49:28.869 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:28.869 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:28.869 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:28.869 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:28.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:49:29.078 INFO:tasks.workunit.client.0.vm04.stdout:osd.4 up in weight 1 up_from 35 
up_thru 36 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/399252542,v1:127.0.0.1:6835/399252542] [v2:127.0.0.1:6836/399252542,v1:127.0.0.1:6837/399252542] exists,up 1db77cd4-11ef-406a-80bb-affaa7b341e6 2026-03-08T22:49:29.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:29.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 5 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=5 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/5 2026-03-08T22:49:29.079 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/5' 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/5/journal' 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:49:29.079 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:29.079 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:49:29.080 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:49:29.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/5 2026-03-08T22:49:29.081 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:49:29.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=a4d2b8bc-73db-4de7-accf-8ae82fffbe4e 2026-03-08T22:49:29.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd5 
a4d2b8bc-73db-4de7-accf-8ae82fffbe4e' 2026-03-08T22:49:29.082 INFO:tasks.workunit.client.0.vm04.stdout:add osd5 a4d2b8bc-73db-4de7-accf-8ae82fffbe4e 2026-03-08T22:49:29.082 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:49:29.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQD5/K1pn1i4BRAACIzpWgIehvWQ0oXy+CBFhw== 2026-03-08T22:49:29.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQD5/K1pn1i4BRAACIzpWgIehvWQ0oXy+CBFhw=="}' 2026-03-08T22:49:29.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new a4d2b8bc-73db-4de7-accf-8ae82fffbe4e -i td/test-erasure-eio/5/new.json 2026-03-08T22:49:29.308 INFO:tasks.workunit.client.0.vm04.stdout:5 2026-03-08T22:49:29.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/5/new.json 2026-03-08T22:49:29.318 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 5 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/5 --osd-journal=td/test-erasure-eio/5/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQD5/K1pn1i4BRAACIzpWgIehvWQ0oXy+CBFhw== --osd-uuid a4d2b8bc-73db-4de7-accf-8ae82fffbe4e 2026-03-08T22:49:29.336 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:29.337+0000 7f3c97719780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:29.339 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:29.340+0000 7f3c97719780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:29.339 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:29.340+0000 7f3c97719780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:29.340 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:29.341+0000 7f3c97719780 -1 bdev(0x55fcc525bc00 td/test-erasure-eio/5/block) open stat got: (1) Operation not permitted 2026-03-08T22:49:29.340 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:29.341+0000 7f3c97719780 -1 bluestore(td/test-erasure-eio/5) _read_fsid unparsable uuid 2026-03-08T22:49:31.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/5/keyring 2026-03-08T22:49:31.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:49:31.702 INFO:tasks.workunit.client.0.vm04.stdout:adding osd5 key to auth repository 2026-03-08T22:49:31.702 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd5 key to auth repository 2026-03-08T22:49:31.702 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/5/keyring auth add osd.5 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 
2026-03-08T22:49:31.980 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.5 2026-03-08T22:49:31.980 INFO:tasks.workunit.client.0.vm04.stdout:start osd.5 2026-03-08T22:49:31.980 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 5 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/5 --osd-journal=td/test-erasure-eio/5/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:49:31.980 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:49:31.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:49:31.983 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:49:31.998 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:31.999+0000 7fd72720f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:32.005 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:32.006+0000 7fd72720f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:32.006 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:32.007+0000 7fd72720f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:32.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 5 2026-03-08T22:49:32.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:32.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T22:49:32.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:32.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:32.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:32.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:32.192 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:49:32.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:32.192 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:49:32.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:33.086 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:33.087+0000 7fd72720f780 -1 Falling back to public 
interface 2026-03-08T22:49:33.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:33.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:33.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:33.395 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:49:33.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:33.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:49:33.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:33.966 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:33.967+0000 7fd72720f780 -1 osd.5 0 log_to_monitors true 2026-03-08T22:49:34.601 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:49:34.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:34.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:34.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:34.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:34.602 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:49:34.825 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:35.828 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:49:35.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:35.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:35.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:49:35.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:35.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stdout:osd.5 up in weight 1 up_from 44 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6842/3955531449,v1:127.0.0.1:6843/3955531449] [v2:127.0.0.1:6844/3955531449,v1:127.0.0.1:6845/3955531449] exists,up a4d2b8bc-73db-4de7-accf-8ae82fffbe4e 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:36.030 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 6 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=6 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/6 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:36.030 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/6' 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/6/journal' 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:49:36.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:36.031 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:36.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:36.031 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:36.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:36.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:49:36.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:49:36.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/6 2026-03-08T22:49:36.033 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:49:36.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=e412477e-fcf9-4b63-a950-692010305db8 2026-03-08T22:49:36.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd6 e412477e-fcf9-4b63-a950-692010305db8' 2026-03-08T22:49:36.035 INFO:tasks.workunit.client.0.vm04.stdout:add osd6 e412477e-fcf9-4b63-a950-692010305db8 2026-03-08T22:49:36.036 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:49:36.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAA/a1pOsIRAxAAtWzqKVk37PnnbI72nJ7i8g== 2026-03-08T22:49:36.050 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAA/a1pOsIRAxAAtWzqKVk37PnnbI72nJ7i8g=="}' 2026-03-08T22:49:36.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e412477e-fcf9-4b63-a950-692010305db8 -i td/test-erasure-eio/6/new.json 2026-03-08T22:49:36.255 INFO:tasks.workunit.client.0.vm04.stdout:6 2026-03-08T22:49:36.265 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/6/new.json 2026-03-08T22:49:36.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 6 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/6 --osd-journal=td/test-erasure-eio/6/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAA/a1pOsIRAxAAtWzqKVk37PnnbI72nJ7i8g== --osd-uuid e412477e-fcf9-4b63-a950-692010305db8 2026-03-08T22:49:36.284 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:36.285+0000 7f7c6d811780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:36.287 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:36.288+0000 7f7c6d811780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:36.288 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:36.289+0000 7f7c6d811780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:36.288 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:36.289+0000 7f7c6d811780 -1 bdev(0x5631d12b9c00 td/test-erasure-eio/6/block) open stat got: (1) Operation not permitted 2026-03-08T22:49:36.288 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:36.289+0000 7f7c6d811780 -1 bluestore(td/test-erasure-eio/6) _read_fsid unparsable uuid 2026-03-08T22:49:38.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/6/keyring 2026-03-08T22:49:38.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:49:38.932 INFO:tasks.workunit.client.0.vm04.stdout:adding osd6 key to auth repository 2026-03-08T22:49:38.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd6 key to auth repository 2026-03-08T22:49:38.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/6/keyring auth add osd.6 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:49:39.215 INFO:tasks.workunit.client.0.vm04.stdout:start osd.6 2026-03-08T22:49:39.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.6 2026-03-08T22:49:39.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 6 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/6 --osd-journal=td/test-erasure-eio/6/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:49:39.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:49:39.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:49:39.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:49:39.232 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:39.233+0000 7f234c695780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:39.240 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:39.241+0000 7f234c695780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:39.242 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:39.242+0000 7f234c695780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:39.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 6 2026-03-08T22:49:39.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:39.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=6 2026-03-08T22:49:39.422 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:49:39.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:39.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:39.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:39.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:39.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:39.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:49:39.632 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:39.796 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:39.797+0000 7f234c695780 -1 Falling back to public interface 2026-03-08T22:49:40.634 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:49:40.634 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:40.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:40.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:40.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:40.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:49:40.656 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:40.657+0000 7f234c695780 -1 osd.6 0 log_to_monitors true 2026-03-08T22:49:40.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:41.851 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:49:41.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:41.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:41.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:41.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:41.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:49:42.051 
INFO:tasks.workunit.client.0.vm04.stdout:osd.6 up in weight 1 up_from 52 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6850/3219941898,v1:127.0.0.1:6851/3219941898] [v2:127.0.0.1:6852/3219941898,v1:127.0.0.1:6853/3219941898] exists,up e412477e-fcf9-4b63-a950-692010305db8 2026-03-08T22:49:42.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:42.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:42.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:42.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:49:42.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 7 2026-03-08T22:49:42.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:49:42.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:49:42.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=7 2026-03-08T22:49:42.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:49:42.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/7 
2026-03-08T22:49:42.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/7' 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/7/journal' 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:49:42.052 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:49:42.052 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:49:42.053 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:49:42.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/7 2026-03-08T22:49:42.054 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:49:42.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=848fedf3-2994-40e8-ae3e-ecca095fb6bd 2026-03-08T22:49:42.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd7 
848fedf3-2994-40e8-ae3e-ecca095fb6bd' 2026-03-08T22:49:42.055 INFO:tasks.workunit.client.0.vm04.stdout:add osd7 848fedf3-2994-40e8-ae3e-ecca095fb6bd 2026-03-08T22:49:42.055 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:49:42.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAG/a1pPxwbBBAAeX5aYeGIMFUZILOs0jkVXw== 2026-03-08T22:49:42.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAG/a1pPxwbBBAAeX5aYeGIMFUZILOs0jkVXw=="}' 2026-03-08T22:49:42.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 848fedf3-2994-40e8-ae3e-ecca095fb6bd -i td/test-erasure-eio/7/new.json 2026-03-08T22:49:42.290 INFO:tasks.workunit.client.0.vm04.stdout:7 2026-03-08T22:49:42.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/7/new.json 2026-03-08T22:49:42.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 7 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/7 --osd-journal=td/test-erasure-eio/7/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAG/a1pPxwbBBAAeX5aYeGIMFUZILOs0jkVXw== --osd-uuid 848fedf3-2994-40e8-ae3e-ecca095fb6bd 2026-03-08T22:49:42.318 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:42.319+0000 7f12d0eb1780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:42.319 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:42.321+0000 7f12d0eb1780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:42.320 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:42.321+0000 7f12d0eb1780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:42.321 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:42.322+0000 7f12d0eb1780 -1 bdev(0x55d361b29c00 td/test-erasure-eio/7/block) open stat got: (1) Operation not permitted 2026-03-08T22:49:42.321 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:42.322+0000 7f12d0eb1780 -1 bluestore(td/test-erasure-eio/7) _read_fsid unparsable uuid 2026-03-08T22:49:44.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/7/keyring 2026-03-08T22:49:44.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:49:44.942 INFO:tasks.workunit.client.0.vm04.stdout:adding osd7 key to auth repository 2026-03-08T22:49:44.942 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd7 key to auth repository 2026-03-08T22:49:44.942 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/7/keyring auth add osd.7 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 
2026-03-08T22:49:45.223 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.7 2026-03-08T22:49:45.223 INFO:tasks.workunit.client.0.vm04.stdout:start osd.7 2026-03-08T22:49:45.224 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 7 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/7 --osd-journal=td/test-erasure-eio/7/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:49:45.224 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:49:45.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:49:45.227 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:49:45.242 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:45.243+0000 7f9674f92780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:45.246 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:45.247+0000 7f9674f92780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:45.247 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:45.248+0000 7f9674f92780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:45.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 7 2026-03-08T22:49:45.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:45.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=7 2026-03-08T22:49:45.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:45.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:45.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:45.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:45.441 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:49:45.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:45.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T22:49:45.651 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:46.309 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:46.310+0000 7f9674f92780 -1 Falling back to public 
interface 2026-03-08T22:49:46.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:46.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:46.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:46.652 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:49:46.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:46.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T22:49:46.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:47.167 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:47.169+0000 7f9674f92780 -1 osd.7 0 log_to_monitors true 2026-03-08T22:49:47.862 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:49:47.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:47.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:47.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:47.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:47.862 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T22:49:48.080 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:49.082 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:49:49.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:49.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:49.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:49:49.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:49.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T22:49:49.285 INFO:tasks.workunit.client.0.vm04.stdout:osd.7 up in weight 1 up_from 62 up_thru 64 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6858/1076118138,v1:127.0.0.1:6859/1076118138] [v2:127.0.0.1:6860/1076118138,v1:127.0.0.1:6861/1076118138] exists,up 848fedf3-2994-40e8-ae3e-ecca095fb6bd 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:49.286 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 8 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=8 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/8 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:49.286 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/8' 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/8/journal' 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:49:49.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:49:49.287 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:49:49.287 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:49.287 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:49:49.287 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:49.287 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:49.287 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:49:49.287 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:49.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:49:49.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:49.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:49:49.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:49:49.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:49:49.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:49:49.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:49.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:49.288 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:49.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:49.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:49:49.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:49:49.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/8 2026-03-08T22:49:49.289 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:49:49.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=b44d8b3d-157e-4dea-a83a-57d2719092bb 2026-03-08T22:49:49.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd8 b44d8b3d-157e-4dea-a83a-57d2719092bb' 2026-03-08T22:49:49.289 INFO:tasks.workunit.client.0.vm04.stdout:add osd8 b44d8b3d-157e-4dea-a83a-57d2719092bb 2026-03-08T22:49:49.289 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:49:49.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAN/a1pXpEdEhAA8KB7ZmrRJ3S5FWX+QwekWg== 2026-03-08T22:49:49.302 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAN/a1pXpEdEhAA8KB7ZmrRJ3S5FWX+QwekWg=="}' 2026-03-08T22:49:49.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new b44d8b3d-157e-4dea-a83a-57d2719092bb -i td/test-erasure-eio/8/new.json 2026-03-08T22:49:49.515 INFO:tasks.workunit.client.0.vm04.stdout:8 2026-03-08T22:49:49.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/8/new.json 2026-03-08T22:49:49.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 8 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/8 --osd-journal=td/test-erasure-eio/8/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAN/a1pXpEdEhAA8KB7ZmrRJ3S5FWX+QwekWg== --osd-uuid b44d8b3d-157e-4dea-a83a-57d2719092bb 2026-03-08T22:49:49.543 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:49.544+0000 7fed70e16780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:49.544 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:49.545+0000 7fed70e16780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:49.545 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:49.546+0000 7fed70e16780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:49.545 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:49.547+0000 7fed70e16780 -1 bdev(0x559c8c685c00 td/test-erasure-eio/8/block) open stat got: (1) Operation not permitted 2026-03-08T22:49:49.545 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:49.547+0000 7fed70e16780 -1 bluestore(td/test-erasure-eio/8) _read_fsid unparsable uuid 2026-03-08T22:49:52.179 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/8/keyring 2026-03-08T22:49:52.179 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:49:52.180 INFO:tasks.workunit.client.0.vm04.stdout:adding osd8 key to auth repository 2026-03-08T22:49:52.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd8 key to auth repository 2026-03-08T22:49:52.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/8/keyring auth add osd.8 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:49:52.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.8 2026-03-08T22:49:52.456 INFO:tasks.workunit.client.0.vm04.stdout:start osd.8 2026-03-08T22:49:52.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 8 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/8 --osd-journal=td/test-erasure-eio/8/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:49:52.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:49:52.457 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:49:52.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:49:52.473 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:52.474+0000 7f3bc1ec6780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:52.481 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:52.482+0000 7f3bc1ec6780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:52.482 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:52.483+0000 7f3bc1ec6780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:52.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 8 2026-03-08T22:49:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=8 2026-03-08T22:49:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:52.663 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:49:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:52.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T22:49:52.866 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:53.549 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:53.550+0000 7f3bc1ec6780 -1 Falling back to public interface 2026-03-08T22:49:53.867 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:49:53.868 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:53.868 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:53.868 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:53.868 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:53.868 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T22:49:54.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:54.408 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:49:54.409+0000 7f3bc1ec6780 -1 osd.8 0 log_to_monitors true 2026-03-08T22:49:55.079 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:49:55.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:55.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:55.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:55.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:55.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T22:49:55.292 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:56.295 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:49:56.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:56.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:56.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:49:56.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:56.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T22:49:56.513 INFO:tasks.workunit.client.0.vm04.stdout:osd.8 up in weight 1 up_from 71 up_thru 73 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6866/2929997924,v1:127.0.0.1:6867/2929997924] [v2:127.0.0.1:6868/2929997924,v1:127.0.0.1:6869/2929997924] exists,up b44d8b3d-157e-4dea-a83a-57d2719092bb 2026-03-08T22:49:56.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:56.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:56.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:56.514 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 
2026-03-08T22:49:56.514 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:49:56.514 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:49:56.514 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:49:56.514 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:56.514 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:49:56.515 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:49:56.515 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T22:49:56.515 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T22:49:56.562 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:49:56.566 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T22:49:56.570 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:49:01.199+0000 7fd8eb1d2780 0 load: jerasure load: lrc 2026-03-08T22:49:56.570 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:423: TEST_ec_recovery_multiple_errors: local poolname=pool-jerasure 2026-03-08T22:49:56.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:424: TEST_ec_recovery_multiple_errors: create_erasure_coded_pool pool-jerasure 4 4 2026-03-08T22:49:56.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:49:56.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T22:49:56.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=4 2026-03-08T22:49:56.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T22:49:56.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=4 2026-03-08T22:49:56.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T22:49:56.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=4 m=4 crush-failure-domain=osd 2026-03-08T22:49:56.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: 
create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T22:49:56.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T22:49:57.209 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T22:49:57.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:49:58.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T22:49:58.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:49:58.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:49:58.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:49:58.219 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:49:58.219 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:49:58.219 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:49:58.220 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:49:58.220 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:49:58.220 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:49:58.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:49:58.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:49:58.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:49:58.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:49:58.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:49:58.289 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:49:58.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:49:58.506 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:49:58.506 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:49:58.506 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:49:58.506 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:49:58.506 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:49:58.506 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:49:58.506 
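The `get_timeout_delays 90 .1` call above expands to the array `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`: delays double from the step, are capped at 15s each, and a final partial delay makes the series sum to exactly the 90s timeout. A reconstruction from that traced output (a sketch of the observed behavior, not the helper's actual code):

```shell
# Emit a geometric backoff series: start at $2, double each delay, cap
# individual delays at 15s, and finish with whatever remainder lands the
# total exactly on the $1 timeout.
backoff_delays() {
    local timeout=$1 step=$2
    awk -v timeout="$timeout" -v step="$step" 'BEGIN {
        total = 0
        d = step
        while (total + d < timeout) {
            printf "%g ", d
            total += d
            d *= 2
            if (d > 15) d = 15      # per-delay cap seen in the trace
        }
        printf "%g\n", timeout - total   # partial tail: 90 - 85.5 = 4.5
    }'
}
```

Callers such as `wait_for_clean` then sleep through the series, so a cluster that settles quickly is detected within tenths of a second while the total wait is still bounded by the timeout.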
INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:49:58.506 INFO:tasks.workunit.client.0.vm04.stderr:8' 2026-03-08T22:49:58.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:49:58.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:58.506 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:49:58.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803789 2026-03-08T22:49:58.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803789 2026-03-08T22:49:58.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803789' 2026-03-08T22:49:58.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:58.573 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:49:58.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574860 2026-03-08T22:49:58.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574860 2026-03-08T22:49:58.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-25769803789 1-55834574860' 2026-03-08T22:49:58.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:58.642 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:49:58.711 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378635 2026-03-08T22:49:58.711 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378635 2026-03-08T22:49:58.711 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803789 1-55834574860 2-81604378635' 2026-03-08T22:49:58.711 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:58.712 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:49:58.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117002 2026-03-08T22:49:58.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117002 2026-03-08T22:49:58.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803789 1-55834574860 2-81604378635 3-115964117002' 2026-03-08T22:49:58.780 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:58.780 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:49:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855368 2026-03-08T22:49:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855368 2026-03-08T22:49:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803789 1-55834574860 2-81604378635 3-115964117002 4-150323855368' 2026-03-08T22:49:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:58.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:49:58.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561031 2026-03-08T22:49:58.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561031 2026-03-08T22:49:58.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803789 1-55834574860 2-81604378635 3-115964117002 4-150323855368 5-188978561031' 2026-03-08T22:49:58.915 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:58.915 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:49:58.982 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=223338299398 2026-03-08T22:49:58.982 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 223338299398 2026-03-08T22:49:58.982 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803789 1-55834574860 2-81604378635 3-115964117002 4-150323855368 5-188978561031 6-223338299398' 2026-03-08T22:49:58.982 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:58.982 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T22:49:59.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=266287972357 2026-03-08T22:49:59.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 266287972357 2026-03-08T22:49:59.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803789 1-55834574860 2-81604378635 3-115964117002 4-150323855368 5-188978561031 6-223338299398 7-266287972357' 2026-03-08T22:49:59.049 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:59.049 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T22:49:59.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678019 2026-03-08T22:49:59.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 304942678019 2026-03-08T22:49:59.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803789 1-55834574860 2-81604378635 3-115964117002 4-150323855368 5-188978561031 6-223338299398 7-266287972357 8-304942678019' 2026-03-08T22:49:59.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:49:59.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803789 2026-03-08T22:49:59.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:49:59.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:49:59.119 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803789 2026-03-08T22:49:59.119 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T22:49:59.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803789 2026-03-08T22:49:59.120 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803789' 2026-03-08T22:49:59.120 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803789 2026-03-08T22:49:59.120 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:49:59.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803787 -lt 25769803789 2026-03-08T22:49:59.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:50:00.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:50:00.328 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:00.539 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803790 -lt 25769803789 2026-03-08T22:50:00.539 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:00.540 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574860 2026-03-08T22:50:00.540 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:00.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:50:00.541 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574860 2026-03-08T22:50:00.541 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:00.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574860 2026-03-08T22:50:00.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574860' 2026-03-08T22:50:00.542 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574860 2026-03-08T22:50:00.542 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:50:00.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574860 -lt 55834574860 2026-03-08T22:50:00.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:00.749 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378635 2026-03-08T22:50:00.749 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T22:50:00.749 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:50:00.750 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378635 2026-03-08T22:50:00.750 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:00.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378635 2026-03-08T22:50:00.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378635' 2026-03-08T22:50:00.751 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378635 2026-03-08T22:50:00.751 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:50:00.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378635 -lt 81604378635 2026-03-08T22:50:00.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:00.954 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117002 2026-03-08T22:50:00.955 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:00.955 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 
2026-03-08T22:50:00.956 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117002 2026-03-08T22:50:00.956 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:00.957 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117002 2026-03-08T22:50:00.957 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117002' 2026-03-08T22:50:00.957 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964117002 2026-03-08T22:50:00.957 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:50:01.162 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117002 -lt 115964117002 2026-03-08T22:50:01.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:01.163 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-150323855368 2026-03-08T22:50:01.163 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:01.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:50:01.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
echo 4-150323855368 2026-03-08T22:50:01.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:01.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855368 2026-03-08T22:50:01.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 150323855368' 2026-03-08T22:50:01.165 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 150323855368 2026-03-08T22:50:01.165 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:50:01.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855369 -lt 150323855368 2026-03-08T22:50:01.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:01.382 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-188978561031 2026-03-08T22:50:01.382 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:01.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:50:01.383 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-188978561031 2026-03-08T22:50:01.383 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:01.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561031 2026-03-08T22:50:01.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 188978561031' 2026-03-08T22:50:01.384 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 188978561031 2026-03-08T22:50:01.384 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:50:01.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561032 -lt 188978561031 2026-03-08T22:50:01.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:01.591 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-223338299398 2026-03-08T22:50:01.591 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:01.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:50:01.592 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-223338299398 2026-03-08T22:50:01.592 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T22:50:01.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=223338299398 2026-03-08T22:50:01.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 223338299398' 2026-03-08T22:50:01.593 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 223338299398 2026-03-08T22:50:01.593 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:50:01.798 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 223338299398 -lt 223338299398 2026-03-08T22:50:01.798 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:01.799 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-266287972357 2026-03-08T22:50:01.799 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:01.800 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T22:50:01.800 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-266287972357 2026-03-08T22:50:01.800 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:01.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=266287972357 2026-03-08T22:50:01.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 266287972357' 2026-03-08T22:50:01.801 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.7 seq 266287972357 2026-03-08T22:50:01.802 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T22:50:02.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 266287972357 -lt 266287972357 2026-03-08T22:50:02.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:02.007 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-304942678019 2026-03-08T22:50:02.008 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:02.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T22:50:02.009 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-304942678019 2026-03-08T22:50:02.009 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:02.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678019 2026-03-08T22:50:02.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: 
flush_pg_stats: echo 'waiting osd.8 seq 304942678019' 2026-03-08T22:50:02.010 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.8 seq 304942678019 2026-03-08T22:50:02.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T22:50:02.224 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678020 -lt 304942678019 2026-03-08T22:50:02.224 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:50:02.224 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:02.224 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:02.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:50:02.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:50:02.503 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:50:02.503 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:50:02.503 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:50:02.503 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:50:02.503 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:50:02.503 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:50:02.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:50:02.715 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:50:02.715 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:02.715 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:02.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:426: 
TEST_ec_recovery_multiple_errors: rados_put td/test-erasure-eio pool-jerasure myobject 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=myobject 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC 
DDDD 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:50:02.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put myobject td/test-erasure-eio/ORIGINAL 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:427: TEST_ec_recovery_multiple_errors: inject_eio ec data pool-jerasure myobject td/test-erasure-eio 0 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=myobject 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 
2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=0 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure myobject 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=myobject 2026-03-08T22:50:03.018 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure myobject 2026-03-08T22:50:03.019 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr:6 
2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr:7' 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 5 1 0 6 2 4 7 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '5' '1' '0' '6' '2' '4' '7') 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=3 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']' 2026-03-08T22:50:03.230 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/3/type 2026-03-08T22:50:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T22:50:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 3 bluestore_debug_inject_read_err true 2026-03-08T22:50:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:50:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=3 2026-03-08T22:50:03.231 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T22:50:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:50:03.232 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:50:03.232 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.3 2026-03-08T22:50:03.232 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3 2026-03-08T22:50:03.232 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']' 2026-03-08T22:50:03.232 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:50:03.233 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:03.233 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:03.233 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok 2026-03-08T22:50:03.233 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.3.asok config set 
bluestore_debug_inject_read_err true 2026-03-08T22:50:03.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T22:50:03.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T22:50:03.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T22:50:03.286 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.3 2026-03-08T22:50:03.286 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3 2026-03-08T22:50:03.286 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']' 2026-03-08T22:50:03.286 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:50:03.287 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:03.287 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:03.287 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok 2026-03-08T22:50:03.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T22:50:03.287 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.3.asok injectdataerr pool-jerasure myobject 0 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:430: TEST_ec_recovery_multiple_errors: inject_eio ec data pool-jerasure myobject td/test-erasure-eio 3 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=myobject 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio 2026-03-08T22:50:03.339 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=3 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure myobject 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=myobject 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure myobject 2026-03-08T22:50:03.339 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr:7' 
2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 5 1 0 6 2 4 7 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '5' '1' '0' '6' '2' '4' '7') 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=0 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']' 2026-03-08T22:50:03.545 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/0/type 2026-03-08T22:50:03.546 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T22:50:03.546 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 0 bluestore_debug_inject_read_err true 2026-03-08T22:50:03.546 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:50:03.546 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=0 2026-03-08T22:50:03.546 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T22:50:03.546 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:50:03.547 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:50:03.547 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.0 2026-03-08T22:50:03.547 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:50:03.547 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:50:03.548 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:50:03.548 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:03.548 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:03.548 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:50:03.548 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.0.asok config set bluestore_debug_inject_read_err true 2026-03-08T22:50:03.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 
2026-03-08T22:50:03.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T22:50:03.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T22:50:03.602 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.0 2026-03-08T22:50:03.602 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:50:03.602 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:50:03.602 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:50:03.602 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:03.602 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:03.603 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:50:03.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T22:50:03.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok injectdataerr pool-jerasure myobject 3 2026-03-08T22:50:03.654 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:431: TEST_ec_recovery_multiple_errors: inject_eio ec data pool-jerasure myobject td/test-erasure-eio 4 2026-03-08T22:50:03.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec 2026-03-08T22:50:03.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=myobject 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T22:50:03.655 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=4 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure myobject 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=myobject 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure myobject 2026-03-08T22:50:03.655 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr:7' 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 5 1 0 
6 2 4 7 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '5' '1' '0' '6' '2' '4' '7') 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=6 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']' 2026-03-08T22:50:03.863 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/6/type 2026-03-08T22:50:03.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T22:50:03.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 6 bluestore_debug_inject_read_err true 2026-03-08T22:50:03.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:50:03.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=6 2026-03-08T22:50:03.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T22:50:03.864 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:50:03.865 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:50:03.865 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.6 2026-03-08T22:50:03.865 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.6 2026-03-08T22:50:03.865 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.6 ']' 2026-03-08T22:50:03.865 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:50:03.865 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:03.865 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:03.866 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.6.asok 2026-03-08T22:50:03.866 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.6.asok config set bluestore_debug_inject_read_err true 2026-03-08T22:50:03.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T22:50:03.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 
2026-03-08T22:50:03.918 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T22:50:03.918 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.6 2026-03-08T22:50:03.918 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.6 2026-03-08T22:50:03.918 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.6 ']' 2026-03-08T22:50:03.919 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:50:03.919 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:03.919 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:03.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.6.asok 2026-03-08T22:50:03.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T22:50:03.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.6.asok injectdataerr pool-jerasure myobject 4 2026-03-08T22:50:03.976 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:433: TEST_ec_recovery_multiple_errors: get_osds 
pool-jerasure myobject 2026-03-08T22:50:03.976 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:50:03.976 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=myobject 2026-03-08T22:50:03.976 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure myobject 2026-03-08T22:50:03.977 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr:7' 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 5 1 0 6 2 4 7 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:433: TEST_ec_recovery_multiple_errors: initial_osds=('3' '5' '1' '0' '6' '2' '4' '7') 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:433: TEST_ec_recovery_multiple_errors: local -a initial_osds 
2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:434: TEST_ec_recovery_multiple_errors: local last_osd=7 2026-03-08T22:50:04.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:436: TEST_ec_recovery_multiple_errors: kill_daemons td/test-erasure-eio TERM osd.7 2026-03-08T22:50:04.189 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:50:04.189 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:50:04.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:50:04.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:50:04.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:50:04.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:50:04.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:437: TEST_ec_recovery_multiple_errors: ceph osd down 7 2026-03-08T22:50:04.509 INFO:tasks.workunit.client.0.vm04.stderr:osd.7 is already down. 
2026-03-08T22:50:04.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:438: TEST_ec_recovery_multiple_errors: ceph osd out 7 2026-03-08T22:50:04.756 INFO:tasks.workunit.client.0.vm04.stderr:osd.7 is already out. 2026-03-08T22:50:04.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:441: TEST_ec_recovery_multiple_errors: wait_for_clean 2026-03-08T22:50:04.766 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:50:04.766 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:50:04.766 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:50:04.766 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:50:04.766 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:50:04.766 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:50:04.766 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:50:04.766 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:50:04.766 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:50:04.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:50:04.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:50:04.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:50:04.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:50:04.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:50:04.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:50:05.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:50:05.051 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:50:05.051 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:50:05.051 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:50:05.051 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:50:05.051 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:50:05.051 INFO:tasks.workunit.client.0.vm04.stderr:6 2026-03-08T22:50:05.051 INFO:tasks.workunit.client.0.vm04.stderr:7 2026-03-08T22:50:05.051 INFO:tasks.workunit.client.0.vm04.stderr:8' 2026-03-08T22:50:05.051 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:50:05.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:05.052 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:50:05.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803793 2026-03-08T22:50:05.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803793 2026-03-08T22:50:05.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793' 2026-03-08T22:50:05.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:05.123 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:50:05.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574863 2026-03-08T22:50:05.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574863 2026-03-08T22:50:05.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793 1-55834574863' 2026-03-08T22:50:05.191 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:05.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:50:05.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378638 2026-03-08T22:50:05.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378638 2026-03-08T22:50:05.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793 1-55834574863 2-81604378638' 2026-03-08T22:50:05.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:05.258 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:50:05.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117005 2026-03-08T22:50:05.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117005 2026-03-08T22:50:05.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793 1-55834574863 2-81604378638 3-115964117005' 2026-03-08T22:50:05.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:05.339 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:50:05.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855372 2026-03-08T22:50:05.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855372 2026-03-08T22:50:05.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793 1-55834574863 2-81604378638 3-115964117005 4-150323855372' 2026-03-08T22:50:05.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:05.408 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:50:05.479 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561035 2026-03-08T22:50:05.479 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561035 2026-03-08T22:50:05.479 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793 1-55834574863 2-81604378638 3-115964117005 4-150323855372 5-188978561035' 2026-03-08T22:50:05.479 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:05.479 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:50:05.546 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=223338299401 2026-03-08T22:50:05.546 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 223338299401 2026-03-08T22:50:05.546 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793 1-55834574863 2-81604378638 3-115964117005 4-150323855372 5-188978561035 6-223338299401' 2026-03-08T22:50:05.546 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:05.547 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T22:50:05.596 INFO:tasks.workunit.client.0.vm04.stderr:Error ENXIO: problem getting command descriptions from osd.7 2026-03-08T22:50:05.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq= 2026-03-08T22:50:05.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z '' 2026-03-08T22:50:05.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2268: flush_pg_stats: continue 2026-03-08T22:50:05.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:05.599 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T22:50:05.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678023 2026-03-08T22:50:05.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 304942678023 2026-03-08T22:50:05.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793 1-55834574863 2-81604378638 3-115964117005 4-150323855372 5-188978561035 6-223338299401 8-304942678023' 2026-03-08T22:50:05.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:05.666 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803793 2026-03-08T22:50:05.666 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:05.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:50:05.667 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803793 2026-03-08T22:50:05.667 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:05.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803793 2026-03-08T22:50:05.668 
INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803793 2026-03-08T22:50:05.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803793' 2026-03-08T22:50:05.668 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:05.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803790 -lt 25769803793 2026-03-08T22:50:05.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:50:06.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:50:06.873 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:07.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803793 -lt 25769803793 2026-03-08T22:50:07.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:07.085 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574863 2026-03-08T22:50:07.085 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:07.086 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
osd=1 2026-03-08T22:50:07.087 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574863 2026-03-08T22:50:07.087 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:07.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574863 2026-03-08T22:50:07.088 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574863 2026-03-08T22:50:07.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574863' 2026-03-08T22:50:07.088 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:50:07.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574864 -lt 55834574863 2026-03-08T22:50:07.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:07.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378638 2026-03-08T22:50:07.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:07.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:50:07.300 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
echo 2-81604378638 2026-03-08T22:50:07.300 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:07.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378638 2026-03-08T22:50:07.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378638' 2026-03-08T22:50:07.301 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378638 2026-03-08T22:50:07.301 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:50:07.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378639 -lt 81604378638 2026-03-08T22:50:07.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:07.502 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117005 2026-03-08T22:50:07.502 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:07.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:50:07.503 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117005 2026-03-08T22:50:07.503 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:07.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117005 2026-03-08T22:50:07.504 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964117005 2026-03-08T22:50:07.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117005' 2026-03-08T22:50:07.504 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:50:07.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117005 -lt 115964117005 2026-03-08T22:50:07.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:07.707 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-150323855372 2026-03-08T22:50:07.707 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:07.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:50:07.709 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-150323855372 2026-03-08T22:50:07.709 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:07.710 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855372 2026-03-08T22:50:07.710 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 150323855372 2026-03-08T22:50:07.710 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 150323855372' 2026-03-08T22:50:07.710 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:50:07.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855372 -lt 150323855372 2026-03-08T22:50:07.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:07.911 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-188978561035 2026-03-08T22:50:07.911 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:07.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:50:07.912 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-188978561035 2026-03-08T22:50:07.913 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:07.913 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561035 
2026-03-08T22:50:07.913 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 188978561035 2026-03-08T22:50:07.913 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 188978561035' 2026-03-08T22:50:07.913 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:50:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561035 -lt 188978561035 2026-03-08T22:50:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:08.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-223338299401 2026-03-08T22:50:08.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:08.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:50:08.116 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-223338299401 2026-03-08T22:50:08.116 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:08.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=223338299401 2026-03-08T22:50:08.117 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 223338299401 2026-03-08T22:50:08.117 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 223338299401' 2026-03-08T22:50:08.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:50:08.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 223338299402 -lt 223338299401 2026-03-08T22:50:08.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:08.322 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-304942678023 2026-03-08T22:50:08.322 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:08.323 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T22:50:08.324 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-304942678023 2026-03-08T22:50:08.324 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:08.324 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678023 2026-03-08T22:50:08.325 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.8 seq 304942678023 2026-03-08T22:50:08.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 
304942678023' 2026-03-08T22:50:08.325 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T22:50:08.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678023 -lt 304942678023 2026-03-08T22:50:08.527 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:50:08.527 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:08.527 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:08.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:50:08.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:50:08.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:50:08.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:50:08.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:50:08.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: 
expression+='select(contains("stale") | not)' 2026-03-08T22:50:08.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:50:08.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:50:09.001 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:50:09.001 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:50:09.001 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:09.001 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:09.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:50:09.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:50:09.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:50:09.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:443: TEST_ec_recovery_multiple_errors: rados_get td/test-erasure-eio pool-jerasure myobject 2026-03-08T22:50:09.280 
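wait_for_clean above compares get_num_pgs against get_num_active_clean, whose jq filter keeps PG states containing both "active" and "clean" but not "stale". A pure-shell rendition of that same filter (hypothetical helper name; the real function pipes `ceph --format json pg dump pgs` through the jq expression built at ceph-helpers.sh:1364-1368):

```shell
# Count PG states that contain "active" and "clean" but not "stale",
# mirroring the jq select() chain shown in the trace.
count_active_clean() {
    local n=0 state
    for state in "$@"; do
        case $state in
            *stale*) ;;                                   # skip stale PGs
            *active*clean*|*clean*active*) n=$((n + 1)) ;;
        esac
    done
    echo "$n"
}

count_active_clean active+clean active+clean stale+active+clean peering  # → 2
```

In the run above both counts come back as 5, so `test 5 = 5` succeeds and wait_for_clean breaks out of its retry loop immediately.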
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:50:09.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:50:09.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=myobject 2026-03-08T22:50:09.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:50:09.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:50:09.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get myobject td/test-erasure-eio/COPY 2026-03-08T22:50:09.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:50:09.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:50:09.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:445: TEST_ec_recovery_multiple_errors: delete_erasure_coded_pool pool-jerasure 2026-03-08T22:50:09.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local 
poolname=pool-jerasure 2026-03-08T22:50:09.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it 2026-03-08T22:50:09.559 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T22:50:09.568 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile 2026-03-08T22:50:09.874 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T22:50:09.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T22:50:09.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:50:09.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:50:09.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:50:09.883 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:50:09.883 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:50:09.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:50:09.883 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:50:09.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:50:10.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:50:10.018 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:50:10.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:50:10.019 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:50:10.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:50:10.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:50:10.020 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:50:10.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:50:10.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:50:10.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q 
'^core\|core$' 2026-03-08T22:50:10.021 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:50:10.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:50:10.022 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:50:10.022 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:50:10.057 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:50:10.057 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:10.057 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:10.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:50:10.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:50:10.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:50:10.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T22:50:10.058 
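The teardown trace above decides whether the test left core dumps behind: it reads kernel.core_pattern, and unless the pattern is a piped handler (first character `|`), lists the pattern's directory for file names beginning or ending in "core". A sketch of that logic, with the sysctl value hard-coded to the one from the trace so the snippet runs anywhere:

```shell
# kernel.core_pattern as seen in the trace; hard-coded here instead of
# `sysctl -n kernel.core_pattern` so the sketch is self-contained.
pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
cores=no
first=$(echo "$pattern" | cut -c 1)
if [ "$first" = "|" ]; then
    cores=maybe     # piped handler: dump files cannot be listed directly
elif ls "$(dirname "$pattern")" 2>/dev/null | grep -q '^core\|core$'; then
    cores=yes       # a core* or *core file exists in the dump directory
fi
echo "cores=$cores"
```

When no matching file is found (as here), `cores` stays `no`, the `'[' no = yes ...` check fails, and teardown proceeds straight to `rm -fr` of the test directory and the asok directory.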
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T22:50:10.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T22:50:10.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio 2026-03-08T22:50:10.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:50:10.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:50:10.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:50:10.059 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:50:10.059 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:50:10.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:50:10.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:50:10.059 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:50:10.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: 
kill_daemons: return 0 2026-03-08T22:50:10.061 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:50:10.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:50:10.061 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:50:10.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:50:10.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:50:10.062 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:50:10.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:50:10.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:50:10.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:50:10.064 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:50:10.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:50:10.065 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:50:10.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:50:10.066 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:50:10.066 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:10.066 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:10.066 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:50:10.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:50:10.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:50:10.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T22:50:10.068 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:50:10.068 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:10.068 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo 
/tmp/ceph-asok.102080 2026-03-08T22:50:10.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080 2026-03-08T22:50:10.069 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:50:10.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:50:10.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:50:10.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T22:50:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T22:50:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T22:50:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:50:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:50:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:50:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T22:50:10.070 
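Between test functions, run() calls setup, which per the trace tears down any previous state, recreates the test and admin-socket directories, checks the open-file limit, and arms a signal trap so an interrupted run still cleans up. A condensed, hypothetical rendition of those steps:

```shell
setup() {
    local dir=$1
    rm -rf "$dir"                  # stands in for the full teardown() call
    mkdir -p "$dir"
    # ceph-helpers.sh:135 checks that the fd limit can support the daemons
    if [ "$(ulimit -n)" -le 1024 ]; then
        echo "WARNING: open file limit $(ulimit -n) may be too low" >&2
    fi
    # ceph-helpers.sh:139 arms cleanup on TERM/HUP/INT
    trap "rm -rf '$dir'" TERM HUP INT
}

setup td/test-erasure-eio
test -d td/test-erasure-eio && echo "setup ok"
```

In the run above `ulimit -n` reports 4096, so the `-le 1024` guard is skipped and setup goes straight to installing the trap before run_mon starts the monitor.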
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T22:50:10.097 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:50:10.097 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:50:10.097 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:50:10.098 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:50:10.098 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:10.098 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:10.098 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:10.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log 
--run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:50:10.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:50:10.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:50:10.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:50:10.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:50:10.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:50:10.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:50:10.125 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:50:10.125 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:50:10.125 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:50:10.127 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:50:10.127 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:10.127 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:10.127 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:50:10.127 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:50:10.127 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T22:50:10.173 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:50:10.173 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:50:10.173 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:50:10.173 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:50:10.173 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:50:10.173 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:50:10.173 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:50:10.173 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:50:10.174 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:50:10.174 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:10.174 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:10.174 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:50:10.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:50:10.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T22:50:10.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T22:50:10.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T22:50:10.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:50:10.219 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:50:10.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:50:10.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T22:50:10.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:50:10.320 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:50:10.320 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:50:10.320 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:50:10.320 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:50:10.320 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:10.320 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:10.320 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:10.321 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: 
run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:50:10.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:50:10.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T22:50:10.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:50:10.447 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:50:10.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:50:11.457 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T22:50:11.457 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:50:11.457 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:50:11.457 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:50:11.457 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:11.457 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:11.457 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:50:11.457 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T22:50:11.457 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T22:50:11.498 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:50:11.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T22:50:11.504 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:50:10.118+0000 7fd2fec0ed80 0 load: jerasure load: lrc 2026-03-08T22:50:11.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_ec_recovery_multiple_objects td/test-erasure-eio 2026-03-08T22:50:11.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:451: TEST_ec_recovery_multiple_objects: local dir=td/test-erasure-eio 2026-03-08T22:50:11.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:452: TEST_ec_recovery_multiple_objects: local objname=myobject 2026-03-08T22:50:11.504 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:454: TEST_ec_recovery_multiple_objects: ORIG_ARGS='--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:50:11.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:455: TEST_ec_recovery_multiple_objects: CEPH_ARGS+=' --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:50:11.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:456: TEST_ec_recovery_multiple_objects: setup_osds 7 2026-03-08T22:50:11.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=7 2026-03-08T22:50:11.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T22:50:11.505 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 7 - 1 2026-03-08T22:50:11.505 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 6 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T22:50:11.506 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T22:50:11.506 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:50:11.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:11.507 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:50:11.507 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:50:11.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T22:50:11.508 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:50:11.508 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0c1c8855-aaa8-47e8-b262-001af728a00d 2026-03-08T22:50:11.508 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 0c1c8855-aaa8-47e8-b262-001af728a00d 2026-03-08T22:50:11.509 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 0c1c8855-aaa8-47e8-b262-001af728a00d' 2026-03-08T22:50:11.509 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:50:11.522 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAj/a1pZZQ/HxAA/enBV09Ss8IfK5LeJQ7oIA== 2026-03-08T22:50:11.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAj/a1pZZQ/HxAA/enBV09Ss8IfK5LeJQ7oIA=="}' 2026-03-08T22:50:11.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0c1c8855-aaa8-47e8-b262-001af728a00d -i td/test-erasure-eio/0/new.json 2026-03-08T22:50:11.644 
INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:50:11.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T22:50:11.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAj/a1pZZQ/HxAA/enBV09Ss8IfK5LeJQ7oIA== --osd-uuid 0c1c8855-aaa8-47e8-b262-001af728a00d 2026-03-08T22:50:11.675 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:11.675+0000 7f2104b53780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:11.676 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:11.677+0000 7f2104b53780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:11.678 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:11.679+0000 7f2104b53780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:50:11.679 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:11.680+0000 7f2104b53780 -1 bdev(0x555fe9214800 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:50:11.679 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:11.680+0000 7f2104b53780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T22:50:13.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T22:50:13.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:50:13.794 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:50:13.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:50:13.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:50:14.062 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:50:14.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:50:14.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 
--osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:50:14.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:50:14.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:50:14.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:50:14.083 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:14.082+0000 7f8a276c4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:14.083 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:14.084+0000 7f8a276c4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:14.085 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:14.085+0000 7f8a276c4780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:50:14.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:50:14.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:50:14.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:50:14.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:50:14.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:50:14.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:14.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:50:14.266 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:50:14.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:14.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:50:14.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:14.655 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:14.656+0000 7f8a276c4780 -1 Falling back to public interface 2026-03-08T22:50:15.476 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:50:15.476 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:15.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:15.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:50:15.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:15.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:50:15.513 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:15.514+0000 7f8a276c4780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:50:15.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:16.456 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:16.457+0000 7f8a22e63640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:50:16.701 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:50:16.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:16.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:16.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:50:16.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:16.701 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:50:16.905 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2304973664,v1:127.0.0.1:6803/2304973664] [v2:127.0.0.1:6804/2304973664,v1:127.0.0.1:6805/2304973664] exists,up 0c1c8855-aaa8-47e8-b262-001af728a00d 2026-03-08T22:50:16.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:50:16.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:50:16.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:50:16.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:50:16.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T22:50:16.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:50:16.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:50:16.906 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' 
--run-dir=td/test-erasure-eio' 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:16.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 
2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:50:16.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:50:16.908 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:50:16.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: 
run_osd: local uuid=e5f3b3a5-768a-44b4-a8dd-4df9a659291e 2026-03-08T22:50:16.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 e5f3b3a5-768a-44b4-a8dd-4df9a659291e' 2026-03-08T22:50:16.909 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 e5f3b3a5-768a-44b4-a8dd-4df9a659291e 2026-03-08T22:50:16.909 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:50:16.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAo/a1pbxwfNxAAzr+twZx71mrUTjPKBAI3HA== 2026-03-08T22:50:16.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAo/a1pbxwfNxAAzr+twZx71mrUTjPKBAI3HA=="}' 2026-03-08T22:50:16.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e5f3b3a5-768a-44b4-a8dd-4df9a659291e -i td/test-erasure-eio/1/new.json 2026-03-08T22:50:17.129 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:50:17.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T22:50:17.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAo/a1pbxwfNxAAzr+twZx71mrUTjPKBAI3HA== --osd-uuid e5f3b3a5-768a-44b4-a8dd-4df9a659291e 2026-03-08T22:50:17.160 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:17.161+0000 7f5a23023780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:17.162 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:17.163+0000 7f5a23023780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:17.163 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:17.164+0000 7f5a23023780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:17.164 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:17.165+0000 7f5a23023780 -1 bdev(0x5643cb8bfc00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:50:17.164 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:17.165+0000 7f5a23023780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T22:50:19.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T22:50:19.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:50:19.795 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T22:50:19.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:50:19.795 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:50:20.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:50:20.069 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T22:50:20.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:50:20.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:50:20.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:50:20.072 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:50:20.089 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:20.090+0000 
7f0bf28c8780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:20.091 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:20.092+0000 7f0bf28c8780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:20.092 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:20.093+0000 7f0bf28c8780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:20.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:50:20.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:50:20.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:50:20.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:50:20.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:50:20.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:20.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:50:20.271 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:50:20.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:20.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:50:20.474 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:20.653 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:20.654+0000 7f0bf28c8780 -1 Falling back to public interface 2026-03-08T22:50:21.476 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:50:21.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:21.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:21.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:50:21.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:21.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:50:21.526 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:21.527+0000 7f0bf28c8780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:50:21.694 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:22.697 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:50:22.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:22.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:22.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:50:22.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:22.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:50:22.906 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/2005616877,v1:127.0.0.1:6811/2005616877] [v2:127.0.0.1:6812/2005616877,v1:127.0.0.1:6813/2005616877] exists,up e5f3b3a5-768a-44b4-a8dd-4df9a659291e 2026-03-08T22:50:22.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:50:22.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:50:22.907 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:50:22.907 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:22.907 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:50:22.908 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:50:22.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: 
mkdir -p td/test-erasure-eio/2 2026-03-08T22:50:22.909 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:50:22.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=bf6dd083-ca4e-4eb2-aeb3-fb5a4bff5002 2026-03-08T22:50:22.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 bf6dd083-ca4e-4eb2-aeb3-fb5a4bff5002' 2026-03-08T22:50:22.910 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 bf6dd083-ca4e-4eb2-aeb3-fb5a4bff5002 2026-03-08T22:50:22.910 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:50:22.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAu/a1pNDwpNxAAA9cQQVBv1rR0H4JHlT7TyQ== 2026-03-08T22:50:22.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAu/a1pNDwpNxAAA9cQQVBv1rR0H4JHlT7TyQ=="}' 2026-03-08T22:50:22.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new bf6dd083-ca4e-4eb2-aeb3-fb5a4bff5002 -i td/test-erasure-eio/2/new.json 2026-03-08T22:50:23.134 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:50:23.144 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T22:50:23.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none 
--mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAu/a1pNDwpNxAAA9cQQVBv1rR0H4JHlT7TyQ== --osd-uuid bf6dd083-ca4e-4eb2-aeb3-fb5a4bff5002 2026-03-08T22:50:23.165 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:23.166+0000 7f041cc22780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:23.167 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:23.168+0000 7f041cc22780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:23.168 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:23.169+0000 7f041cc22780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:50:23.168 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:23.169+0000 7f041cc22780 -1 bdev(0x558dd38d1c00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:50:23.168 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:23.169+0000 7f041cc22780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T22:50:25.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T22:50:25.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:50:25.805 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:50:25.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:50:25.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:50:26.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:50:26.082 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:50:26.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 
--osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:50:26.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:50:26.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:50:26.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:50:26.102 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:26.103+0000 7f4d2ae09780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:26.105 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:26.106+0000 7f4d2ae09780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:26.107 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:26.107+0000 7f4d2ae09780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T22:50:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:50:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T22:50:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:50:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:50:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:50:26.290 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:50:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:50:26.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:50:26.663 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:26.663+0000 7f4d2ae09780 -1 Falling back to public interface
2026-03-08T22:50:27.494 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:50:27.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:50:27.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:27.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:50:27.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:27.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:50:27.693 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:50:28.018 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:28.019+0000 7f4d2ae09780 -1 osd.2 0 log_to_monitors true
2026-03-08T22:50:28.696 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:50:28.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:50:28.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:28.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:50:28.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:28.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:50:28.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:50:28.954 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:28.955+0000 7f4d265a8640 -1 osd.2 0 waiting for initial osdmap
2026-03-08T22:50:29.938 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:50:29.938 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:50:29.938 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:29.938 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:50:29.938 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:29.938 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 18 up_thru 20 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/2703782229,v1:127.0.0.1:6819/2703782229] [v2:127.0.0.1:6820/2703782229,v1:127.0.0.1:6821/2703782229] exists,up bf6dd083-ca4e-4eb2-aeb3-fb5a4bff5002
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 '
2026-03-08T22:50:30.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3'
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal'
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:50:30.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:50:30.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3
2026-03-08T22:50:30.149 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:50:30.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=db4c7aa6-2c2f-4b50-9ec8-0990a64fa9ca
2026-03-08T22:50:30.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 db4c7aa6-2c2f-4b50-9ec8-0990a64fa9ca'
2026-03-08T22:50:30.150 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 db4c7aa6-2c2f-4b50-9ec8-0990a64fa9ca
2026-03-08T22:50:30.150 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:50:30.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQA2/a1p3MDfCRAAYxDtD30I6yHbieGSE4QocA==
2026-03-08T22:50:30.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQA2/a1p3MDfCRAAYxDtD30I6yHbieGSE4QocA=="}'
2026-03-08T22:50:30.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new db4c7aa6-2c2f-4b50-9ec8-0990a64fa9ca -i td/test-erasure-eio/3/new.json
2026-03-08T22:50:30.379 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:50:30.388 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json
2026-03-08T22:50:30.389 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQA2/a1p3MDfCRAAYxDtD30I6yHbieGSE4QocA== --osd-uuid db4c7aa6-2c2f-4b50-9ec8-0990a64fa9ca
2026-03-08T22:50:30.408 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:30.409+0000 7ff9db81d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:30.410 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:30.411+0000 7ff9db81d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:30.411 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:30.412+0000 7ff9db81d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:30.412 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:30.413+0000 7ff9db81d780 -1 bdev(0x5652e4f9fc00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted
2026-03-08T22:50:30.412 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:30.413+0000 7ff9db81d780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid
2026-03-08T22:50:32.547 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring
2026-03-08T22:50:32.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:50:32.548 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository
2026-03-08T22:50:32.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository
2026-03-08T22:50:32.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:50:32.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3
2026-03-08T22:50:32.820 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3
2026-03-08T22:50:32.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:50:32.820 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:50:32.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:50:32.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:50:32.839 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:32.840+0000 7fe4c0213780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:32.846 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:32.847+0000 7fe4c0213780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:32.847 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:32.848+0000 7fe4c0213780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:33.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3
2026-03-08T22:50:33.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:50:33.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3
2026-03-08T22:50:33.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:50:33.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:50:33.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:33.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:50:33.034 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:50:33.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:33.034 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:50:33.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:50:33.925 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:33.926+0000 7fe4c0213780 -1 Falling back to public interface
2026-03-08T22:50:34.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:50:34.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:34.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:50:34.247 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:50:34.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:50:34.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:34.453 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:50:34.788 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:34.789+0000 7fe4c0213780 -1 osd.3 0 log_to_monitors true
2026-03-08T22:50:35.455 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:50:35.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:50:35.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:35.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:50:35.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:35.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:50:35.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:50:36.667 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:50:36.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:50:36.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:36.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:50:36.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:36.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:50:36.871 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 28 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/722897165,v1:127.0.0.1:6827/722897165] [v2:127.0.0.1:6828/722897165,v1:127.0.0.1:6829/722897165] exists,up db4c7aa6-2c2f-4b50-9ec8-0990a64fa9ca
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 4
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/4
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 '
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/4'
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/4/journal'
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:50:36.872 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:50:36.873 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:50:36.873 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:50:36.873 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:50:36.873 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:50:36.873 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:50:36.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:50:36.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:50:36.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:50:36.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:50:36.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:50:36.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:50:36.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:50:36.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:50:36.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:50:36.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:50:36.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:50:36.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:50:36.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/4
2026-03-08T22:50:36.875 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:50:36.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=2796f1f6-df27-459f-9a5a-7f122336e5d2
2026-03-08T22:50:36.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 2796f1f6-df27-459f-9a5a-7f122336e5d2'
2026-03-08T22:50:36.875 INFO:tasks.workunit.client.0.vm04.stdout:add osd4 2796f1f6-df27-459f-9a5a-7f122336e5d2
2026-03-08T22:50:36.876 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:50:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQA8/a1p+u0bNRAAWcYGmYkJK31yentia93evg==
2026-03-08T22:50:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQA8/a1p+u0bNRAAWcYGmYkJK31yentia93evg=="}'
2026-03-08T22:50:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 2796f1f6-df27-459f-9a5a-7f122336e5d2 -i td/test-erasure-eio/4/new.json
2026-03-08T22:50:37.098 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-08T22:50:37.107 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/4/new.json
2026-03-08T22:50:37.108 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQA8/a1p+u0bNRAAWcYGmYkJK31yentia93evg== --osd-uuid 2796f1f6-df27-459f-9a5a-7f122336e5d2
2026-03-08T22:50:37.128 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:37.129+0000 7f353ab3f780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:37.130 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:37.131+0000 7f353ab3f780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:37.131 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:37.132+0000 7f353ab3f780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:37.131 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:37.132+0000 7f353ab3f780 -1 bdev(0x56359fd6bc00 td/test-erasure-eio/4/block) open stat got: (1) Operation not permitted
2026-03-08T22:50:37.131 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:37.132+0000 7f353ab3f780 -1 bluestore(td/test-erasure-eio/4) _read_fsid unparsable uuid
2026-03-08T22:50:39.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/4/keyring
2026-03-08T22:50:39.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:50:39.258 INFO:tasks.workunit.client.0.vm04.stdout:adding osd4 key to auth repository
2026-03-08T22:50:39.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository
2026-03-08T22:50:39.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:50:39.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4
2026-03-08T22:50:39.540 INFO:tasks.workunit.client.0.vm04.stdout:start osd.4
2026-03-08T22:50:39.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:50:39.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:50:39.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:50:39.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:50:39.559 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:39.560+0000 7f4f3849d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:39.566 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:39.567+0000 7f4f3849d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:39.567 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:39.568+0000 7f4f3849d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:39.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4 2026-03-08T22:50:39.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:50:39.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4 2026-03-08T22:50:39.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:50:39.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:50:39.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:39.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:50:39.761 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:50:39.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:39.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:50:39.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:40.376 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:40.377+0000 7f4f3849d780 -1 Falling back to public interface 2026-03-08T22:50:40.972 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:50:40.972 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:40.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:40.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:50:40.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:40.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:50:41.194 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:41.239 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:41.240+0000 7f4f3849d780 -1 osd.4 0 log_to_monitors true 2026-03-08T22:50:42.156 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:42.157+0000 7f4f33c3e640 -1 osd.4 0 waiting for initial osdmap 2026-03-08T22:50:42.195 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:42.195 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:42.195 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:50:42.196 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:50:42.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:42.196 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:50:42.421 INFO:tasks.workunit.client.0.vm04.stdout:osd.4 up in weight 1 up_from 35 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/3518619811,v1:127.0.0.1:6835/3518619811] [v2:127.0.0.1:6836/3518619811,v1:127.0.0.1:6837/3518619811] exists,up 2796f1f6-df27-459f-9a5a-7f122336e5d2 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 5 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=5 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:50:42.422 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/5 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/5' 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/5/journal' 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' 
--run-dir=td/test-erasure-eio' 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:42.422 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:42.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:42.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:42.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:50:42.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:50:42.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 
2026-03-08T22:50:42.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:50:42.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:50:42.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:50:42.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:50:42.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:50:42.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:50:42.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:50:42.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:50:42.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/5 2026-03-08T22:50:42.431 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:50:42.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: 
run_osd: local uuid=1420aa7b-5c5a-40af-9901-456b4fa36744 2026-03-08T22:50:42.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd5 1420aa7b-5c5a-40af-9901-456b4fa36744' 2026-03-08T22:50:42.432 INFO:tasks.workunit.client.0.vm04.stdout:add osd5 1420aa7b-5c5a-40af-9901-456b4fa36744 2026-03-08T22:50:42.432 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:50:42.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBC/a1pIMiqGhAAqFSgabcRIYIhAxWZdHQYOQ== 2026-03-08T22:50:42.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBC/a1pIMiqGhAAqFSgabcRIYIhAxWZdHQYOQ=="}' 2026-03-08T22:50:42.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1420aa7b-5c5a-40af-9901-456b4fa36744 -i td/test-erasure-eio/5/new.json 2026-03-08T22:50:42.657 INFO:tasks.workunit.client.0.vm04.stdout:5 2026-03-08T22:50:42.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/5/new.json 2026-03-08T22:50:42.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 5 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/5 --osd-journal=td/test-erasure-eio/5/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBC/a1pIMiqGhAAqFSgabcRIYIhAxWZdHQYOQ== --osd-uuid 1420aa7b-5c5a-40af-9901-456b4fa36744 2026-03-08T22:50:42.686 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:42.686+0000 7f0c0fa1d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:42.687 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:42.689+0000 7f0c0fa1d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:42.688 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:42.689+0000 7f0c0fa1d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:42.689 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:42.690+0000 7f0c0fa1d780 -1 bdev(0x55a87bfbdc00 td/test-erasure-eio/5/block) open stat got: (1) Operation not permitted 2026-03-08T22:50:42.689 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:42.690+0000 7f0c0fa1d780 -1 bluestore(td/test-erasure-eio/5) _read_fsid unparsable uuid 2026-03-08T22:50:44.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/5/keyring 2026-03-08T22:50:44.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:50:44.814 INFO:tasks.workunit.client.0.vm04.stdout:adding osd5 key to auth repository 2026-03-08T22:50:44.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd5 key to auth repository 2026-03-08T22:50:44.814 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/5/keyring auth add osd.5 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:50:45.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.5 2026-03-08T22:50:45.098 INFO:tasks.workunit.client.0.vm04.stdout:start osd.5 2026-03-08T22:50:45.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 5 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/5 --osd-journal=td/test-erasure-eio/5/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:50:45.098 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:50:45.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:50:45.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:50:45.118 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:45.119+0000 
7f76c8423780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:45.126 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:45.127+0000 7f76c8423780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:45.127 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:45.128+0000 7f76c8423780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:45.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 5 2026-03-08T22:50:45.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:50:45.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T22:50:45.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:50:45.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:50:45.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:45.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:50:45.319 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:50:45.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:45.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:50:45.534 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:46.444 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:46.445+0000 7f76c8423780 -1 Falling back to public interface 2026-03-08T22:50:46.536 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:50:46.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:46.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:46.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:50:46.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:46.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:50:46.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:47.299 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:47.300+0000 7f76c8423780 -1 osd.5 0 log_to_monitors true 2026-03-08T22:50:47.745 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:50:47.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:47.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:47.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:50:47.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:47.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:50:47.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:48.625 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:48.626+0000 7f76c3618640 -1 osd.5 0 waiting for initial osdmap 2026-03-08T22:50:48.993 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:50:48.993 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:48.993 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:48.993 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:50:48.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:48.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:50:49.216 INFO:tasks.workunit.client.0.vm04.stdout:osd.5 up in weight 1 up_from 43 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6842/3812243135,v1:127.0.0.1:6843/3812243135] [v2:127.0.0.1:6844/3812243135,v1:127.0.0.1:6845/3812243135] exists,up 1420aa7b-5c5a-40af-9901-456b4fa36744 2026-03-08T22:50:49.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: 
wait_for_osd: status=0 2026-03-08T22:50:49.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:50:49.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 6 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=6 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/6 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:50:49.217 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/6' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/6/journal' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:50:49.217 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:50:49.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:50:49.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' 
--osd-max-object-name-len=460' 2026-03-08T22:50:49.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:50:49.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:50:49.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:50:49.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:50:49.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:50:49.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/6 2026-03-08T22:50:49.219 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:50:49.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=9ede22cb-ceb1-4a2c-9896-293bc0917e56 2026-03-08T22:50:49.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd6 9ede22cb-ceb1-4a2c-9896-293bc0917e56' 2026-03-08T22:50:49.220 INFO:tasks.workunit.client.0.vm04.stdout:add osd6 9ede22cb-ceb1-4a2c-9896-293bc0917e56 2026-03-08T22:50:49.220 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 
2026-03-08T22:50:49.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBJ/a1p6/oEDhAAdQegNgwrR8n6QhvStsVzYg== 2026-03-08T22:50:49.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBJ/a1p6/oEDhAAdQegNgwrR8n6QhvStsVzYg=="}' 2026-03-08T22:50:49.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 9ede22cb-ceb1-4a2c-9896-293bc0917e56 -i td/test-erasure-eio/6/new.json 2026-03-08T22:50:49.459 INFO:tasks.workunit.client.0.vm04.stdout:6 2026-03-08T22:50:49.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/6/new.json 2026-03-08T22:50:49.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 6 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/6 --osd-journal=td/test-erasure-eio/6/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBJ/a1p6/oEDhAAdQegNgwrR8n6QhvStsVzYg== --osd-uuid 9ede22cb-ceb1-4a2c-9896-293bc0917e56 2026-03-08T22:50:49.492 
INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:49.493+0000 7f7d938d2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:49.494 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:49.495+0000 7f7d938d2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:49.496 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:49.497+0000 7f7d938d2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:49.496 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:49.497+0000 7f7d938d2780 -1 bdev(0x55fa51837c00 td/test-erasure-eio/6/block) open stat got: (1) Operation not permitted 2026-03-08T22:50:49.496 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:49.497+0000 7f7d938d2780 -1 bluestore(td/test-erasure-eio/6) _read_fsid unparsable uuid 2026-03-08T22:50:52.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/6/keyring 2026-03-08T22:50:52.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:50:52.118 INFO:tasks.workunit.client.0.vm04.stdout:adding osd6 key to auth repository 2026-03-08T22:50:52.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd6 key to auth repository 2026-03-08T22:50:52.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/6/keyring auth add osd.6 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:50:52.406 INFO:tasks.workunit.client.0.vm04.stdout:start osd.6 2026-03-08T22:50:52.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.6 
2026-03-08T22:50:52.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 6 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/6 --osd-journal=td/test-erasure-eio/6/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:50:52.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:50:52.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:50:52.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:50:52.429 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:52.429+0000 7f899d360780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:52.433 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:52.434+0000 7f899d360780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:52.436 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:52.436+0000 7f899d360780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:50:52.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 6 2026-03-08T22:50:52.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:50:52.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=6 2026-03-08T22:50:52.621 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:50:52.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:50:52.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:50:52.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:52.622 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:50:52.622 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:52.622 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:50:52.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:52.988 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:52.989+0000 7f899d360780 -1 Falling back to public interface 2026-03-08T22:50:53.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:50:53.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:53.829 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:50:53.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:50:53.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:50:53.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:53.848 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:50:53.849+0000 7f899d360780 -1 osd.6 0 log_to_monitors true 2026-03-08T22:50:54.042 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:55.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:55.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:55.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:50:55.043 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:50:55.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:55.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:50:55.253 INFO:tasks.workunit.client.0.vm04.stdout:osd.6 up in weight 1 up_from 51 
up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6850/3490812740,v1:127.0.0.1:6851/3490812740] [v2:127.0.0.1:6852/3490812740,v1:127.0.0.1:6853/3490812740] exists,up 9ede22cb-ceb1-4a2c-9896-293bc0917e56 2026-03-08T22:50:55.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:50:55.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:50:55.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:50:55.254 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 2026-03-08T22:50:55.254 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:50:55.254 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:50:55.254 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:50:55.254 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:55.254 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:50:55.254 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:50:55.254 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T22:50:55.255 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T22:50:55.300 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:50:55.304 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T22:50:55.307 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:50:14.660+0000 7f8a276c4780 0 load: jerasure load: lrc 2026-03-08T22:50:55.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:457: TEST_ec_recovery_multiple_objects: CEPH_ARGS='--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:50:55.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:459: TEST_ec_recovery_multiple_objects: local poolname=pool-jerasure 2026-03-08T22:50:55.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:460: TEST_ec_recovery_multiple_objects: create_erasure_coded_pool pool-jerasure 3 2 2026-03-08T22:50:55.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:50:55.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 
2026-03-08T22:50:55.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=3 2026-03-08T22:50:55.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T22:50:55.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=2 2026-03-08T22:50:55.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T22:50:55.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=3 m=2 crush-failure-domain=osd 2026-03-08T22:50:55.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T22:50:55.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T22:50:55.941 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T22:50:55.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:50:56.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T22:50:56.952 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:50:56.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:50:56.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:50:56.952 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:50:56.953 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:50:56.953 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:50:56.953 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:50:56.953 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:50:56.953 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:50:57.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:50:57.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:50:57.028 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:50:57.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:50:57.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:50:57.028 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:50:57.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:50:57.238 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:50:57.238 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:50:57.238 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:50:57.238 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:50:57.238 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:50:57.238 INFO:tasks.workunit.client.0.vm04.stderr:6' 2026-03-08T22:50:57.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:50:57.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:57.239 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:50:57.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803787 2026-03-08T22:50:57.309 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803787 2026-03-08T22:50:57.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787' 2026-03-08T22:50:57.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:57.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:50:57.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574857 2026-03-08T22:50:57.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574857 2026-03-08T22:50:57.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574857' 2026-03-08T22:50:57.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:57.378 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:50:57.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411336 2026-03-08T22:50:57.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411336 2026-03-08T22:50:57.446 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574857 2-77309411336' 2026-03-08T22:50:57.446 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:57.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:50:57.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116999 2026-03-08T22:50:57.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116999 2026-03-08T22:50:57.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574857 2-77309411336 3-115964116999' 2026-03-08T22:50:57.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:57.514 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:50:57.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855366 2026-03-08T22:50:57.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855366 2026-03-08T22:50:57.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574857 
2-77309411336 3-115964116999 4-150323855366' 2026-03-08T22:50:57.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:57.580 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:50:57.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=184683593732 2026-03-08T22:50:57.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 184683593732 2026-03-08T22:50:57.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574857 2-77309411336 3-115964116999 4-150323855366 5-184683593732' 2026-03-08T22:50:57.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:57.647 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:50:57.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332099 2026-03-08T22:50:57.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332099 2026-03-08T22:50:57.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574857 2-77309411336 3-115964116999 4-150323855366 5-184683593732 6-219043332099' 2026-03-08T22:50:57.714 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:57.714 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803787 2026-03-08T22:50:57.714 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:57.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:50:57.715 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803787 2026-03-08T22:50:57.716 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:57.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803787 2026-03-08T22:50:57.716 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803787 2026-03-08T22:50:57.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803787' 2026-03-08T22:50:57.717 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:57.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803784 -lt 25769803787 2026-03-08T22:50:57.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:50:58.920 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:50:58.920 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:59.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803787 -lt 25769803787 2026-03-08T22:50:59.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:59.129 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574857 2026-03-08T22:50:59.129 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:59.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:50:59.130 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574857 2026-03-08T22:50:59.130 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:59.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574857 2026-03-08T22:50:59.131 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574857 2026-03-08T22:50:59.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574857' 
2026-03-08T22:50:59.131 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:50:59.343 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574858 -lt 55834574857 2026-03-08T22:50:59.343 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:59.343 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-77309411336 2026-03-08T22:50:59.343 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:59.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:50:59.345 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-77309411336 2026-03-08T22:50:59.345 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:59.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411336 2026-03-08T22:50:59.346 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 77309411336 2026-03-08T22:50:59.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 77309411336' 2026-03-08T22:50:59.346 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 2 2026-03-08T22:50:59.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411336 -lt 77309411336 2026-03-08T22:50:59.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:59.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116999 2026-03-08T22:50:59.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:59.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:50:59.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116999 2026-03-08T22:50:59.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:59.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116999 2026-03-08T22:50:59.556 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116999 2026-03-08T22:50:59.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116999' 2026-03-08T22:50:59.556 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:50:59.763 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116999 -lt 115964116999 2026-03-08T22:50:59.763 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:59.764 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-150323855366 2026-03-08T22:50:59.764 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:59.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:50:59.765 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-150323855366 2026-03-08T22:50:59.765 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:59.766 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855366 2026-03-08T22:50:59.766 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 150323855366 2026-03-08T22:50:59.766 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 150323855366' 2026-03-08T22:50:59.766 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:50:59.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855366 -lt 
150323855366 2026-03-08T22:50:59.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:59.971 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-184683593732 2026-03-08T22:50:59.971 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:59.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:50:59.972 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-184683593732 2026-03-08T22:50:59.972 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:59.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=184683593732 2026-03-08T22:50:59.973 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 184683593732 2026-03-08T22:50:59.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 184683593732' 2026-03-08T22:50:59.973 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:51:00.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 184683593732 -lt 184683593732 2026-03-08T22:51:00.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T22:51:00.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-219043332099 2026-03-08T22:51:00.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:00.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:51:00.176 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-219043332099 2026-03-08T22:51:00.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:00.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332099 2026-03-08T22:51:00.177 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 219043332099 2026-03-08T22:51:00.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 219043332099' 2026-03-08T22:51:00.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:51:00.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 219043332100 -lt 219043332099 2026-03-08T22:51:00.377 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:51:00.378 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:00.378 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:00.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:51:00.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:51:00.646 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:51:00.646 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:51:00.646 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:51:00.646 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:51:00.647 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:51:00.647 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:51:00.841 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:51:00.841 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:51:00.841 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:00.841 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:462: TEST_ec_recovery_multiple_objects: rados_put td/test-erasure-eio pool-jerasure test1 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=test1 2026-03-08T22:51:01.106 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:51:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put test1 td/test-erasure-eio/ORIGINAL 2026-03-08T22:51:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:463: TEST_ec_recovery_multiple_objects: rados_put td/test-erasure-eio pool-jerasure test2 2026-03-08T22:51:01.131 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:51:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:51:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=test2 2026-03-08T22:51:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:51:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:51:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:51:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.131 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:51:01.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put test2 td/test-erasure-eio/ORIGINAL 2026-03-08T22:51:01.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:464: TEST_ec_recovery_multiple_objects: rados_put td/test-erasure-eio pool-jerasure test3 2026-03-08T22:51:01.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:51:01.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:51:01.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=test3 2026-03-08T22:51:01.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:51:01.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:51:01.160 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:51:01.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:51:01.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:51:01.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put test3 td/test-erasure-eio/ORIGINAL 2026-03-08T22:51:01.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:466: TEST_ec_recovery_multiple_objects: ceph osd out 0 2026-03-08T22:51:01.460 INFO:tasks.workunit.client.0.vm04.stderr:osd.0 is already out. 
2026-03-08T22:51:01.470 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:469: TEST_ec_recovery_multiple_objects: wait_for_clean 2026-03-08T22:51:01.470 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:51:01.470 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:51:01.470 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:51:01.470 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:51:01.470 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:51:01.470 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:51:01.471 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:51:01.471 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:51:01.471 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:51:01.549 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' 
'15' '4.5') 2026-03-08T22:51:01.549 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:51:01.549 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:51:01.549 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:51:01.549 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:51:01.549 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:51:01.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:51:01.751 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:51:01.751 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:51:01.751 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:51:01.751 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:51:01.751 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:51:01.751 INFO:tasks.workunit.client.0.vm04.stderr:6' 2026-03-08T22:51:01.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:51:01.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:01.752 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:51:01.816 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803790 2026-03-08T22:51:01.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803790 2026-03-08T22:51:01.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790' 2026-03-08T22:51:01.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:01.816 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:51:01.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574860 2026-03-08T22:51:01.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574860 2026-03-08T22:51:01.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-55834574860' 2026-03-08T22:51:01.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:01.881 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:51:01.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411339 2026-03-08T22:51:01.944 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411339 2026-03-08T22:51:01.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-55834574860 2-77309411339' 2026-03-08T22:51:01.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:01.944 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:51:02.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117002 2026-03-08T22:51:02.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117002 2026-03-08T22:51:02.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-55834574860 2-77309411339 3-115964117002' 2026-03-08T22:51:02.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:02.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:51:02.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855368 2026-03-08T22:51:02.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855368 
2026-03-08T22:51:02.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-55834574860 2-77309411339 3-115964117002 4-150323855368' 2026-03-08T22:51:02.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:02.081 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:51:02.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=184683593735 2026-03-08T22:51:02.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 184683593735 2026-03-08T22:51:02.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-55834574860 2-77309411339 3-115964117002 4-150323855368 5-184683593735' 2026-03-08T22:51:02.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:02.145 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:51:02.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332102 2026-03-08T22:51:02.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332102 2026-03-08T22:51:02.208 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803790 1-55834574860 2-77309411339 3-115964117002 4-150323855368 5-184683593735 6-219043332102' 2026-03-08T22:51:02.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:02.208 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803790 2026-03-08T22:51:02.208 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:02.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:51:02.210 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803790 2026-03-08T22:51:02.210 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:02.210 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803790 2026-03-08T22:51:02.210 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803790 2026-03-08T22:51:02.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803790' 2026-03-08T22:51:02.211 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:51:02.411 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803790 -lt 25769803790 2026-03-08T22:51:02.411 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:02.412 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574860 2026-03-08T22:51:02.412 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:02.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:51:02.413 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574860 2026-03-08T22:51:02.413 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:02.414 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574860 2026-03-08T22:51:02.414 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574860 2026-03-08T22:51:02.414 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574860' 2026-03-08T22:51:02.414 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:51:02.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574860 -lt 
55834574860 2026-03-08T22:51:02.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:02.610 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-77309411339 2026-03-08T22:51:02.610 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:02.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:51:02.611 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-77309411339 2026-03-08T22:51:02.611 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:02.612 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411339 2026-03-08T22:51:02.612 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 77309411339' 2026-03-08T22:51:02.612 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 77309411339 2026-03-08T22:51:02.612 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:51:02.810 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411339 -lt 77309411339 2026-03-08T22:51:02.810 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T22:51:02.811 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117002 2026-03-08T22:51:02.811 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:02.812 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:51:02.812 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117002 2026-03-08T22:51:02.812 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:02.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117002 2026-03-08T22:51:02.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117002' 2026-03-08T22:51:02.813 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964117002 2026-03-08T22:51:02.813 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:51:03.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117002 -lt 115964117002 2026-03-08T22:51:03.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:03.010 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-150323855368 2026-03-08T22:51:03.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:03.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:51:03.012 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-150323855368 2026-03-08T22:51:03.012 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:03.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855368 2026-03-08T22:51:03.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 150323855368' 2026-03-08T22:51:03.013 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 150323855368 2026-03-08T22:51:03.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:51:03.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855368 -lt 150323855368 2026-03-08T22:51:03.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:03.218 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-184683593735 
2026-03-08T22:51:03.218 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:03.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:51:03.219 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-184683593735 2026-03-08T22:51:03.219 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:03.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=184683593735 2026-03-08T22:51:03.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 184683593735' 2026-03-08T22:51:03.220 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 184683593735 2026-03-08T22:51:03.220 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:51:03.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 184683593735 -lt 184683593735 2026-03-08T22:51:03.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:03.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-219043332102 2026-03-08T22:51:03.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T22:51:03.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:51:03.419 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:03.419 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-219043332102 2026-03-08T22:51:03.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332102 2026-03-08T22:51:03.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 219043332102' 2026-03-08T22:51:03.420 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 219043332102 2026-03-08T22:51:03.420 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:51:03.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 219043332102 -lt 219043332102 2026-03-08T22:51:03.618 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:51:03.619 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:03.619 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:03.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: 
wait_for_clean: test 5 == 0 2026-03-08T22:51:03.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:51:03.883 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:51:03.883 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:51:03.883 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:51:03.883 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:51:03.883 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:51:03.883 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:51:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T22:51:04.084 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:51:04.084 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:04.084 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:04.358 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 5 2026-03-08T22:51:04.358 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 2 '!=' -1 2026-03-08T22:51:04.358 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:51:04.358 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=2 2026-03-08T22:51:04.358 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:51:04.358 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:51:04.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:51:04.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:51:04.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:51:04.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:51:04.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: 
expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:51:04.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:51:04.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:51:04.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:51:04.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T22:51:04.660 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:51:04.660 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:04.660 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:04.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 5 2026-03-08T22:51:04.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 3 '!=' 2 2026-03-08T22:51:04.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:51:04.929 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=3 2026-03-08T22:51:04.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:51:04.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:51:05.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:51:05.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:51:05.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:51:05.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:51:05.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:51:05.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:51:05.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:51:05.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and 
contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:51:05.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T22:51:05.230 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:51:05.231 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:05.231 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:05.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 5 2026-03-08T22:51:05.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 3 '!=' 3 2026-03-08T22:51:05.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:51:05.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:51:05.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:51:05.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:51:05.499 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:51:05.501 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:51:05.501 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:51:05.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null 2026-03-08T22:51:05.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:51:05.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 1 >= 13 )) 2026-03-08T22:51:05.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:51:05.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.2 2026-03-08T22:51:05.974 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:51:05.974 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:51:05.975 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: 
wait_for_clean: get_num_active_clean 2026-03-08T22:51:05.975 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:51:05.975 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:51:05.975 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:51:05.975 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:51:05.975 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:51:06.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T22:51:06.178 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:51:06.179 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:06.179 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:06.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 5 
2026-03-08T22:51:06.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 3 '!=' 3 2026-03-08T22:51:06.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:51:06.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:51:06.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:51:06.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:51:06.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:51:06.449 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:51:06.449 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:51:06.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=3071 2026-03-08T22:51:06.727 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 3071 '!=' null 2026-03-08T22:51:06.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:51:06.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:51:06.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:51:06.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:51:06.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:51:06.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:51:06.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:51:06.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:51:06.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:51:06.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:51:06.828 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:51:07.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:51:07.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:51:07.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:07.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:07.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:51:07.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:51:07.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:51:07.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:471: TEST_ec_recovery_multiple_objects: rados_get td/test-erasure-eio pool-jerasure test1 2026-03-08T22:51:07.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:51:07.302 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:51:07.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=test1 2026-03-08T22:51:07.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:51:07.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:51:07.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get test1 td/test-erasure-eio/COPY 2026-03-08T22:51:07.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:51:07.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:51:07.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:472: TEST_ec_recovery_multiple_objects: rados_get td/test-erasure-eio pool-jerasure test2 2026-03-08T22:51:07.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:51:07.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 
2026-03-08T22:51:07.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=test2 2026-03-08T22:51:07.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:51:07.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:51:07.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get test2 td/test-erasure-eio/COPY 2026-03-08T22:51:07.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:51:07.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:51:07.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:473: TEST_ec_recovery_multiple_objects: rados_get td/test-erasure-eio pool-jerasure test3 2026-03-08T22:51:07.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:51:07.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:51:07.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local 
objname=test3 2026-03-08T22:51:07.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:51:07.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:51:07.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get test3 td/test-erasure-eio/COPY 2026-03-08T22:51:07.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:51:07.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:51:07.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:475: TEST_ec_recovery_multiple_objects: delete_erasure_coded_pool pool-jerasure 2026-03-08T22:51:07.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:51:07.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it 2026-03-08T22:51:07.631 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T22:51:07.640 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: 
ceph osd erasure-code-profile rm myprofile 2026-03-08T22:51:07.900 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T22:51:07.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T22:51:07.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:51:07.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:51:07.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:51:07.910 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:51:07.910 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:51:07.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:51:07.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:51:07.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:51:08.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:51:08.070 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: 
teardown: uname 2026-03-08T22:51:08.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:51:08.071 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:51:08.072 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:51:08.072 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:51:08.073 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:51:08.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:51:08.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:51:08.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:51:08.074 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:51:08.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:51:08.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:51:08.075 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:51:08.100 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:51:08.100 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:08.100 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:08.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:51:08.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:51:08.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:51:08.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T22:51:08.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T22:51:08.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T22:51:08.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio 2026-03-08T22:51:08.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: 
teardown: local dir=td/test-erasure-eio 2026-03-08T22:51:08.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:51:08.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:51:08.104 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:51:08.104 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:51:08.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:51:08.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:51:08.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:51:08.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:51:08.106 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:51:08.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:51:08.106 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:51:08.107 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:51:08.107 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:51:08.107 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:51:08.108 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:51:08.108 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:51:08.108 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:51:08.108 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:51:08.109 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:51:08.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:51:08.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:51:08.111 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:51:08.111 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:08.111 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:08.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:51:08.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:51:08.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:51:08.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T22:51:08.112 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:51:08.112 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:08.113 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:08.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080 2026-03-08T22:51:08.114 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:51:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:51:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:51:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T22:51:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T22:51:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T22:51:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:51:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:51:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:51:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T22:51:08.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T22:51:08.137 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:51:08.137 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:08.137 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:08.138 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:08.138 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:08.138 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:08.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:08.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:51:08.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:51:08.165 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 
2026-03-08T22:51:08.165 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:51:08.165 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:51:08.165 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:51:08.165 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:51:08.165 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:51:08.165 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:51:08.165 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:51:08.166 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:51:08.166 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:08.166 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:08.166 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:51:08.166 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:51:08.166 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T22:51:08.211 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:51:08.211 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:51:08.211 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:51:08.211 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:51:08.211 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:51:08.211 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:51:08.211 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:51:08.211 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:51:08.212 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:51:08.212 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:08.212 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:08.212 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:51:08.212 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:51:08.212 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T22:51:08.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T22:51:08.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T22:51:08.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:51:08.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:51:08.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:51:08.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T22:51:08.258 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:51:08.361 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:51:08.361 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:08.361 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:08.361 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:08.361 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:08.361 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:08.362 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:08.362 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:51:08.363 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 
--run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:51:08.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T22:51:08.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:51:08.482 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:51:08.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:51:09.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T22:51:09.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:51:09.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:51:09.493 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:51:09.493 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:09.493 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:09.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:51:09.493 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T22:51:09.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T22:51:09.557 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:51:09.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T22:51:09.573 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:51:08.158+0000 7f006580dd80 0 load: jerasure load: lrc 2026-03-08T22:51:09.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_ec_recovery_multiple_objects_eio td/test-erasure-eio 2026-03-08T22:51:09.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:480: TEST_ec_recovery_multiple_objects_eio: local dir=td/test-erasure-eio 2026-03-08T22:51:09.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:481: TEST_ec_recovery_multiple_objects_eio: local objname=myobject 2026-03-08T22:51:09.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:483: TEST_ec_recovery_multiple_objects_eio: ORIG_ARGS='--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:51:09.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:484: TEST_ec_recovery_multiple_objects_eio: CEPH_ARGS+=' 
--osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:51:09.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:485: TEST_ec_recovery_multiple_objects_eio: setup_osds 7 2026-03-08T22:51:09.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=7 2026-03-08T22:51:09.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T22:51:09.576 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 7 - 1 2026-03-08T22:51:09.576 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 6 2026-03-08T22:51:09.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:51:09.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T22:51:09.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: 
run_osd: shift 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: 
ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:09.579 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:09.580 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:09.580 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:09.580 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:09.580 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 
2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:51:09.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T22:51:09.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:51:09.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: 
run_osd: local uuid=873cf7dc-fef3-4562-ad74-ad53e7fd57b2 2026-03-08T22:51:09.583 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 873cf7dc-fef3-4562-ad74-ad53e7fd57b2 2026-03-08T22:51:09.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 873cf7dc-fef3-4562-ad74-ad53e7fd57b2' 2026-03-08T22:51:09.583 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:51:09.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBd/a1psESfJBAAZjd+Xj266Dosy8yZcvINdw== 2026-03-08T22:51:09.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBd/a1psESfJBAAZjd+Xj266Dosy8yZcvINdw=="}' 2026-03-08T22:51:09.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 873cf7dc-fef3-4562-ad74-ad53e7fd57b2 -i td/test-erasure-eio/0/new.json 2026-03-08T22:51:09.783 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:51:09.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T22:51:09.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBd/a1psESfJBAAZjd+Xj266Dosy8yZcvINdw== --osd-uuid 873cf7dc-fef3-4562-ad74-ad53e7fd57b2 2026-03-08T22:51:09.815 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:09.815+0000 7f8bc0509780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:09.816 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:09.817+0000 7f8bc0509780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:09.817 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:09.818+0000 7f8bc0509780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:09.817 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:09.818+0000 7f8bc0509780 -1 bdev(0x5565120f2800 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:51:09.817 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:09.818+0000 7f8bc0509780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T22:51:11.928 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T22:51:11.928 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:51:11.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:51:11.929 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:51:11.929 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:51:12.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:51:12.230 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:51:12.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:51:12.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:51:12.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:51:12.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:51:12.249 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:12.249+0000 
7fd3da519780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:12.253 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:12.254+0000 7fd3da519780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:12.254 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:12.255+0000 7fd3da519780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:12.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:51:12.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:12.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:51:12.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:12.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:12.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:12.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:12.440 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:51:12.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:12.440 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:51:12.651 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:12.813 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:12.814+0000 7fd3da519780 -1 Falling back to public interface 2026-03-08T22:51:13.653 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:51:13.653 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:13.653 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:13.653 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:13.653 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:13.653 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:51:13.679 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:13.680+0000 7fd3da519780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:51:13.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:14.893 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:51:14.893 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:14.893 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:14.893 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:51:14.893 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:14.893 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:51:15.113 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3645290403,v1:127.0.0.1:6803/3645290403] [v2:127.0.0.1:6804/3645290403,v1:127.0.0.1:6805/3645290403] exists,up 873cf7dc-fef3-4562-ad74-ad53e7fd57b2 2026-03-08T22:51:15.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:15.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:15.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:15.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:51:15.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T22:51:15.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:51:15.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:51:15.113 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:51:15.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:15.114 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:15.114 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:15.115 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:51:15.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: 
mkdir -p td/test-erasure-eio/1 2026-03-08T22:51:15.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:51:15.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=a40f4e0b-fbd4-41fe-8c5e-561bad47918d 2026-03-08T22:51:15.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 a40f4e0b-fbd4-41fe-8c5e-561bad47918d' 2026-03-08T22:51:15.117 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 a40f4e0b-fbd4-41fe-8c5e-561bad47918d 2026-03-08T22:51:15.118 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:51:15.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBj/a1pfMf+BxAAad2In3iTpcrIy4gBIWlvaA== 2026-03-08T22:51:15.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBj/a1pfMf+BxAAad2In3iTpcrIy4gBIWlvaA=="}' 2026-03-08T22:51:15.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new a40f4e0b-fbd4-41fe-8c5e-561bad47918d -i td/test-erasure-eio/1/new.json 2026-03-08T22:51:15.358 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:51:15.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T22:51:15.369 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none 
--mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBj/a1pfMf+BxAAad2In3iTpcrIy4gBIWlvaA== --osd-uuid a40f4e0b-fbd4-41fe-8c5e-561bad47918d 2026-03-08T22:51:15.394 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:15.395+0000 7f1a5c184780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:15.396 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:15.397+0000 7f1a5c184780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:15.397 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:15.398+0000 7f1a5c184780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:15.398 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:15.399+0000 7f1a5c184780 -1 bdev(0x557567131c00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:51:15.398 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:15.399+0000 7f1a5c184780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T22:51:17.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T22:51:17.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:51:17.541 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T22:51:17.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:51:17.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:51:17.825 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:51:17.825 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T22:51:17.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 
--osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:51:17.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:51:17.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:51:17.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:51:17.846 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:17.847+0000 7f2b8760f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:17.854 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:17.854+0000 7f2b8760f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:17.855 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:17.856+0000 7f2b8760f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:18.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:51:18.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:18.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:51:18.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:18.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:18.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:18.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:18.060 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:51:18.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:18.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:18.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:19.193 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:19.194+0000 7f2b8760f780 -1 Falling back to public interface 2026-03-08T22:51:19.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:51:19.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:19.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:19.281 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:51:19.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:19.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:19.521 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:20.062 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:20.063+0000 7f2b8760f780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:51:20.523 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:51:20.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:20.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:20.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:20.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:20.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:20.780 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:21.783 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:51:21.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:21.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:21.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:21.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:21.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:22.010 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/766125107,v1:127.0.0.1:6811/766125107] [v2:127.0.0.1:6812/766125107,v1:127.0.0.1:6813/766125107] exists,up a40f4e0b-fbd4-41fe-8c5e-561bad47918d 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count 
- 1)) 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:22.011 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:22.011 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:22.012 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:22.012 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:51:22.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:51:22.014 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:51:22.015 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=cfa19f80-81b8-4fda-b21d-87c227651b34 2026-03-08T22:51:22.015 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 cfa19f80-81b8-4fda-b21d-87c227651b34' 2026-03-08T22:51:22.015 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 cfa19f80-81b8-4fda-b21d-87c227651b34 2026-03-08T22:51:22.015 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:51:22.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBq/a1prL/pARAAQVbobpR4lSYhiyJHVaai8Q== 2026-03-08T22:51:22.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBq/a1prL/pARAAQVbobpR4lSYhiyJHVaai8Q=="}' 2026-03-08T22:51:22.031 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new cfa19f80-81b8-4fda-b21d-87c227651b34 -i td/test-erasure-eio/2/new.json 2026-03-08T22:51:22.271 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:51:22.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T22:51:22.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBq/a1prL/pARAAQVbobpR4lSYhiyJHVaai8Q== --osd-uuid cfa19f80-81b8-4fda-b21d-87c227651b34 2026-03-08T22:51:22.306 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:22.307+0000 7f030ab28780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:22.308 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:22.309+0000 7f030ab28780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:22.309 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:22.310+0000 7f030ab28780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:22.309 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:22.310+0000 7f030ab28780 -1 bdev(0x562782fe5c00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:51:22.310 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:22.311+0000 7f030ab28780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T22:51:24.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T22:51:24.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:51:24.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:51:24.499 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:51:24.500 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:51:24.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:51:24.805 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:51:24.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 
--osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:51:24.806 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:51:24.806 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:51:24.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:51:24.827 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:24.827+0000 7f2204164780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:24.835 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:24.836+0000 7f2204164780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:24.837 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:24.837+0000 7f2204164780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:25.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:51:25.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:25.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:51:25.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:25.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:25.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:25.047 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:51:25.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:25.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:25.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:51:25.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:26.172 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:26.172+0000 7f2204164780 -1 Falling back to public interface 2026-03-08T22:51:26.269 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:51:26.269 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:26.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:26.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:26.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:26.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:51:26.484 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:27.019 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:27.020+0000 7f2204164780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:51:27.486 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:51:27.486 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:27.486 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:27.486 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:27.486 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:27.486 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:51:27.738 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:27.944 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:27.945+0000 7f21ff903640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T22:51:28.741 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:51:28.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:28.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:28.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:28.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:28.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:51:28.966 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 19 up_thru 19 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/4021290065,v1:127.0.0.1:6819/4021290065] [v2:127.0.0.1:6820/4021290065,v1:127.0.0.1:6821/4021290065] exists,up cfa19f80-81b8-4fda-b21d-87c227651b34 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:28.967 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:51:28.967 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:28.967 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:28.968 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:28.968 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:28.968 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:28.968 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:28.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:28.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:28.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:51:28.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:51:28.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:51:28.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:51:28.969 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:28.969 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:28.969 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:28.969 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:28.969 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:51:28.969 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:51:28.969 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:51:28.970 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:51:28.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=e26a16f8-b1b1-4fdb-a09f-5592d518f9d5 2026-03-08T22:51:28.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 e26a16f8-b1b1-4fdb-a09f-5592d518f9d5' 2026-03-08T22:51:28.971 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 e26a16f8-b1b1-4fdb-a09f-5592d518f9d5 2026-03-08T22:51:28.971 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:51:28.987 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBw/a1pLmLhOhAAHwms2JAtHWd4pQzroVp4XQ== 2026-03-08T22:51:28.987 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBw/a1pLmLhOhAAHwms2JAtHWd4pQzroVp4XQ=="}' 2026-03-08T22:51:28.987 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e26a16f8-b1b1-4fdb-a09f-5592d518f9d5 -i td/test-erasure-eio/3/new.json 2026-03-08T22:51:29.222 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:51:29.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json 2026-03-08T22:51:29.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBw/a1pLmLhOhAAHwms2JAtHWd4pQzroVp4XQ== --osd-uuid e26a16f8-b1b1-4fdb-a09f-5592d518f9d5 2026-03-08T22:51:29.257 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:29.257+0000 7fb52dc1c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:29.259 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:29.260+0000 7fb52dc1c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:29.262 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:29.263+0000 7fb52dc1c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:29.263 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:29.264+0000 7fb52dc1c780 -1 bdev(0x55b08ca45c00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:51:29.263 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:29.264+0000 7fb52dc1c780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid 2026-03-08T22:51:31.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring 2026-03-08T22:51:31.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:51:31.419 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository 2026-03-08T22:51:31.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:51:31.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:51:31.722 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:51:31.722 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3 2026-03-08T22:51:31.722 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:51:31.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:51:31.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:51:31.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:51:31.744 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:31.744+0000 7fc805504780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:31.752 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:31.753+0000 7fc805504780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:31.754 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:31.754+0000 7fc805504780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:31.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:51:31.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:31.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:51:31.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:31.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:31.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:31.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:31.997 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:51:31.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:31.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:51:32.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:33.081 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:33.082+0000 7fc805504780 -1 Falling back to public interface 2026-03-08T22:51:33.220 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:51:33.220 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:33.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:33.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:33.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:33.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:51:33.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:33.936 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:33.937+0000 7fc805504780 -1 osd.3 0 log_to_monitors true 2026-03-08T22:51:34.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:34.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:34.449 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:51:34.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:34.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:34.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:51:34.695 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:34.944 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:34.945+0000 7fc800ca3640 -1 osd.3 0 waiting for initial osdmap 2026-03-08T22:51:35.698 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:51:35.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:35.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:35.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:35.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:35.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:51:35.928 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 28 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/4197309653,v1:127.0.0.1:6827/4197309653] [v2:127.0.0.1:6828/4197309653,v1:127.0.0.1:6829/4197309653] exists,up e26a16f8-b1b1-4fdb-a09f-5592d518f9d5 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:35.929 
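The `wait_for_osd up 3` activity traced above (ceph-helpers.sh:978-991) is a bounded retry loop: echo the iteration count, run `ceph osd dump | grep "osd.N up"`, sleep 1, give up after 300 tries. A minimal runnable sketch of that pattern follows; `check_up` is a stub standing in for the `ceph osd dump | grep` probe (here it reports "up" on the third poll, mimicking the trace, where osd.3 appeared after a few iterations):

```shell
#!/usr/bin/env bash
# Sketch of the wait_for_osd polling pattern from ceph-helpers.sh:978-991.
# check_up is a stub for `ceph osd dump | grep "osd.$id up"`; it succeeds
# on the 3rd call to mimic an OSD that comes up while we poll.
attempts=0
check_up() {
    attempts=$((attempts + 1))
    [ "$attempts" -ge 3 ]
}

wait_for_osd() {
    local id=$1 status=1 i=0
    while [ "$i" -lt 300 ]; do          # bounded retry, as in :982
        echo "$i"                       # the helper echoes the count (:983)
        if check_up "$id"; then         # probe; real helper greps osd dump (:984)
            status=0
            break
        fi
        i=$((i + 1))                    # the real helper sleeps 1s here (:985)
    done
    return "$status"
}

wait_for_osd 3 && echo "osd.3 up after $attempts polls"
```

The 300-iteration cap bounds the wait at roughly five minutes before the helper returns nonzero and the test fails fast instead of hanging.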
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 4 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/4 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:51:35.929 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/4' 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/4/journal' 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:35.929 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:35.930 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:35.930 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:35.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:51:35.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:51:35.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/4 2026-03-08T22:51:35.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:51:35.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1b2081c6-d346-4897-a5ad-e2a1975cfa4b 2026-03-08T22:51:35.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 1b2081c6-d346-4897-a5ad-e2a1975cfa4b' 2026-03-08T22:51:35.933 INFO:tasks.workunit.client.0.vm04.stdout:add osd4 1b2081c6-d346-4897-a5ad-e2a1975cfa4b 2026-03-08T22:51:35.933 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:51:35.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB3/a1p8WGjOBAAD6nlkEuSSAlnQL7yYHDeSQ== 2026-03-08T22:51:35.949 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB3/a1p8WGjOBAAD6nlkEuSSAlnQL7yYHDeSQ=="}' 2026-03-08T22:51:35.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1b2081c6-d346-4897-a5ad-e2a1975cfa4b -i td/test-erasure-eio/4/new.json 2026-03-08T22:51:36.180 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T22:51:36.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/4/new.json 2026-03-08T22:51:36.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB3/a1p8WGjOBAAD6nlkEuSSAlnQL7yYHDeSQ== --osd-uuid 1b2081c6-d346-4897-a5ad-e2a1975cfa4b 2026-03-08T22:51:36.212 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:36.213+0000 7fac5400d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:36.214 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:36.215+0000 7fac5400d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:36.215 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:36.217+0000 7fac5400d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:36.216 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:36.217+0000 7fac5400d780 -1 bdev(0x55848e90dc00 td/test-erasure-eio/4/block) open stat got: (1) Operation not permitted 2026-03-08T22:51:36.216 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:36.217+0000 7fac5400d780 -1 bluestore(td/test-erasure-eio/4) _read_fsid unparsable uuid 2026-03-08T22:51:38.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/4/keyring 2026-03-08T22:51:38.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:51:38.611 INFO:tasks.workunit.client.0.vm04.stdout:adding osd4 key to auth repository 2026-03-08T22:51:38.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository 2026-03-08T22:51:38.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:51:38.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4 2026-03-08T22:51:38.920 INFO:tasks.workunit.client.0.vm04.stdout:start osd.4 2026-03-08T22:51:38.920 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:51:38.920 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:51:38.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:51:38.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:51:38.943 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:38.942+0000 7f5da020d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:38.944 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:38.945+0000 7f5da020d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:38.946 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:38.946+0000 7f5da020d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:39.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4 2026-03-08T22:51:39.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:39.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4 2026-03-08T22:51:39.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:39.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:39.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:39.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:39.148 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:51:39.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:39.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:51:39.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:39.754 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:39.754+0000 7f5da020d780 -1 Falling back to public interface 2026-03-08T22:51:40.377 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:51:40.377 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:40.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:40.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:40.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:40.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:51:40.606 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:40.683 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:40.684+0000 7f5da020d780 -1 osd.4 0 log_to_monitors true 2026-03-08T22:51:41.608 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:51:41.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:41.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:41.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:41.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:41.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:51:41.867 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:42.870 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:51:42.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:42.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:42.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:42.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:42.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:51:43.090 INFO:tasks.workunit.client.0.vm04.stdout:osd.4 up in weight 1 up_from 36 up_thru 39 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/334019813,v1:127.0.0.1:6835/334019813] [v2:127.0.0.1:6836/334019813,v1:127.0.0.1:6837/334019813] exists,up 1b2081c6-d346-4897-a5ad-e2a1975cfa4b 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count 
- 1)) 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 5 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=5 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/5 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:43.091 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/5' 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/5/journal' 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:43.091 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:43.091 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:43.092 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:51:43.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/5 2026-03-08T22:51:43.093 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:51:43.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=9b6cda8e-2bc2-4b21-90d2-23c950c9e695 2026-03-08T22:51:43.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd5 9b6cda8e-2bc2-4b21-90d2-23c950c9e695' 2026-03-08T22:51:43.094 INFO:tasks.workunit.client.0.vm04.stdout:add osd5 9b6cda8e-2bc2-4b21-90d2-23c950c9e695 2026-03-08T22:51:43.094 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:51:43.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB//a1pzwqhBhAAPuuvu+5+V2nteKKdSTlcvA== 2026-03-08T22:51:43.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB//a1pzwqhBhAAPuuvu+5+V2nteKKdSTlcvA=="}' 2026-03-08T22:51:43.110 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 9b6cda8e-2bc2-4b21-90d2-23c950c9e695 -i td/test-erasure-eio/5/new.json 2026-03-08T22:51:43.324 INFO:tasks.workunit.client.0.vm04.stdout:5 2026-03-08T22:51:43.333 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/5/new.json 2026-03-08T22:51:43.334 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 5 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/5 --osd-journal=td/test-erasure-eio/5/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB//a1pzwqhBhAAPuuvu+5+V2nteKKdSTlcvA== --osd-uuid 9b6cda8e-2bc2-4b21-90d2-23c950c9e695 2026-03-08T22:51:43.355 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:43.355+0000 7f48f8f4d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:43.357 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:43.358+0000 7f48f8f4d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:43.358 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:43.359+0000 7f48f8f4d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:43.358 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:43.359+0000 7f48f8f4d780 -1 bdev(0x55d8407b7c00 td/test-erasure-eio/5/block) open stat got: (1) Operation not permitted 2026-03-08T22:51:43.358 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:43.359+0000 7f48f8f4d780 -1 bluestore(td/test-erasure-eio/5) _read_fsid unparsable uuid 2026-03-08T22:51:45.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/5/keyring 2026-03-08T22:51:45.497 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:51:45.497 INFO:tasks.workunit.client.0.vm04.stdout:adding osd5 key to auth repository 2026-03-08T22:51:45.498 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd5 key to auth repository 2026-03-08T22:51:45.498 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/5/keyring auth add osd.5 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:51:45.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.5 2026-03-08T22:51:45.788 INFO:tasks.workunit.client.0.vm04.stdout:start osd.5 2026-03-08T22:51:45.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 5 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/5 
--osd-journal=td/test-erasure-eio/5/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:51:45.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:51:45.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:51:45.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:51:45.813 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:45.813+0000 7f8d94a22780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:45.815 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:45.816+0000 7f8d94a22780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:45.817 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:45.817+0000 7f8d94a22780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:46.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 5 2026-03-08T22:51:46.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:46.018 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:51:46.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T22:51:46.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:46.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:46.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:46.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:46.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:46.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:51:46.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:46.902 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:46.903+0000 7f8d94a22780 -1 Falling back to public interface 2026-03-08T22:51:47.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:51:47.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:47.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:47.239 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:51:47.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:47.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:51:47.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:48.026 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:48.027+0000 7f8d94a22780 -1 osd.5 0 log_to_monitors true 2026-03-08T22:51:48.459 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:51:48.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:48.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:48.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:48.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:51:48.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:48.722 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:49.500 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:49.500+0000 7f8d8fb96640 -1 osd.5 0 waiting for initial osdmap 2026-03-08T22:51:49.725 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:51:49.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:49.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:49.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:49.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:49.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:51:49.949 INFO:tasks.workunit.client.0.vm04.stdout:osd.5 up in weight 1 up_from 46 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6842/1178279984,v1:127.0.0.1:6843/1178279984] [v2:127.0.0.1:6844/1178279984,v1:127.0.0.1:6845/1178279984] exists,up 9b6cda8e-2bc2-4b21-90d2-23c950c9e695 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:49.950 
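The osd.5 bring-up above ends with wait_for_osd polling `ceph osd dump` once per second, up to 300 times, until `osd.5 up` appears (the bare 0/1/2/3 lines on stdout are the attempt counter). A condensed sketch of that loop, reconstructed from the xtrace; `wait_for_condition` and `fake_check` are stand-in names so the sketch can run without a cluster — the real helper in qa/standalone/ceph-helpers.sh pipes `ceph osd dump` into grep instead:

```shell
# Poll until "$check" succeeds, mirroring wait_for_osd in
# qa/standalone/ceph-helpers.sh: up to 300 attempts, one second apart,
# printing the attempt number on stdout just like the trace above.
wait_for_condition() {
    local check=$1   # in the real helper: ceph osd dump | grep "osd.$id up"
    local status=1
    for ((i = 0; i < 300; i++)); do
        echo $i
        if eval "$check"; then
            status=0
            break
        fi
        sleep 1
    done
    return $status
}

# Stand-in for the cluster query: fails three times, then succeeds,
# matching osd.5 above, which was reported up on the fourth poll.
n=0
fake_check() {
    n=$((n + 1))
    [ "$n" -gt 3 ]
}

wait_for_condition fake_check && echo "osd reported up"
```

The 300 × 1s bound gives a daemon five minutes to register with the monitor before the test fails, which is why a slow `waiting for initial osdmap` phase (as seen above) is harmless.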
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 6 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=6 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/6 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:51:49.950 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/6' 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/6/journal' 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:49.950 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:49.951 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:49.951 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:49.951 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:49.951 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:49.952 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:51:49.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/6 2026-03-08T22:51:49.953 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:51:49.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1cf421b2-25c5-4cc2-9923-4f5e43c53b04 2026-03-08T22:51:49.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd6 1cf421b2-25c5-4cc2-9923-4f5e43c53b04' 2026-03-08T22:51:49.954 INFO:tasks.workunit.client.0.vm04.stdout:add osd6 1cf421b2-25c5-4cc2-9923-4f5e43c53b04 2026-03-08T22:51:49.954 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:51:49.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCF/a1pXJ3eORAASwRua43e/sFO1mP92WXwBA== 2026-03-08T22:51:49.970 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCF/a1pXJ3eORAASwRua43e/sFO1mP92WXwBA=="}' 2026-03-08T22:51:49.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1cf421b2-25c5-4cc2-9923-4f5e43c53b04 -i td/test-erasure-eio/6/new.json 2026-03-08T22:51:50.201 INFO:tasks.workunit.client.0.vm04.stdout:6 2026-03-08T22:51:50.213 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/6/new.json 2026-03-08T22:51:50.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 6 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/6 --osd-journal=td/test-erasure-eio/6/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCF/a1pXJ3eORAASwRua43e/sFO1mP92WXwBA== --osd-uuid 1cf421b2-25c5-4cc2-9923-4f5e43c53b04 2026-03-08T22:51:50.236 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:50.237+0000 7fadef69b780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:50.238 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:50.239+0000 7fadef69b780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:50.239 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:50.240+0000 7fadef69b780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:50.239 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:50.240+0000 7fadef69b780 -1 bdev(0x557ba9d29c00 td/test-erasure-eio/6/block) open stat got: (1) Operation not permitted 2026-03-08T22:51:50.240 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:50.240+0000 7fadef69b780 -1 bluestore(td/test-erasure-eio/6) _read_fsid unparsable uuid 2026-03-08T22:51:53.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/6/keyring 2026-03-08T22:51:53.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:51:53.124 INFO:tasks.workunit.client.0.vm04.stdout:adding osd6 key to auth repository 2026-03-08T22:51:53.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd6 key to auth repository 2026-03-08T22:51:53.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/6/keyring auth add osd.6 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:51:53.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.6 2026-03-08T22:51:53.436 INFO:tasks.workunit.client.0.vm04.stdout:start osd.6 2026-03-08T22:51:53.437 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
6 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/6 --osd-journal=td/test-erasure-eio/6/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:51:53.437 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:51:53.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:51:53.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:51:53.459 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:53.460+0000 7f9cf46ba780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:53.466 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:53.467+0000 7f9cf46ba780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:53.467 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:53.468+0000 7f9cf46ba780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:53.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 6 2026-03-08T22:51:53.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:53.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=6 2026-03-08T22:51:53.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:53.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:53.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:53.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:53.670 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:51:53.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:53.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:51:53.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:54.271 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:54.272+0000 7f9cf46ba780 -1 Falling back to public interface 2026-03-08T22:51:54.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:51:54.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:54.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:54.927 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:51:54.928 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:54.928 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:51:55.123 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:55.124+0000 7f9cf46ba780 -1 osd.6 0 log_to_monitors true 2026-03-08T22:51:55.152 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:56.155 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:51:56.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:56.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:56.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:56.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:56.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:51:56.392 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:56.783 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:51:56.784+0000 7f9cefe59640 -1 osd.6 0 waiting for initial osdmap 2026-03-08T22:51:57.394 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:51:57.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:57.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:57.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:57.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:57.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:51:57.619 INFO:tasks.workunit.client.0.vm04.stdout:osd.6 up in weight 1 up_from 53 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6850/862000159,v1:127.0.0.1:6851/862000159] [v2:127.0.0.1:6852/862000159,v1:127.0.0.1:6853/862000159] exists,up 1cf421b2-25c5-4cc2-9923-4f5e43c53b04 2026-03-08T22:51:57.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:57.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:57.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:57.619 
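Both OSD bring-ups follow the same run_osd recipe: start from the cluster-wide CEPH_ARGS, append per-daemon paths, the admin-socket template, debug levels, and log/pid files, then run `ceph-osd --mkfs` and start the daemon with the same argument list. A minimal sketch of the argument assembly; `build_osd_args` is a hypothetical name, but every flag and value below is copied from the lines 639–658 traced above. Note that `$cluster` and `$name` are deliberately left unexpanded (hence the single quotes in the trace) so ceph-osd substitutes them itself:

```shell
# Rebuild the per-OSD argument list the way run_osd does in the trace:
# per-daemon data/journal paths, run dir, admin socket, debug levels,
# and log/pid files keyed on the literal $name placeholder.
build_osd_args() {
    local dir=$1 id=$2
    local osd_data=$dir/$id
    local args="--osd-failsafe-full-ratio=.99"
    args+=" --osd-journal-size=100"
    args+=" --osd-scrub-load-threshold=2000"
    args+=" --osd-data=$osd_data"
    args+=" --osd-journal=$osd_data/journal"
    args+=" --chdir="
    args+=" --run-dir=$dir"
    args+=" --admin-socket=/tmp/ceph-asok.$$/\$cluster-\$name.asok"
    args+=" --debug-osd=20 --debug-ms=1 --debug-monc=20"
    args+=" --log-file=$dir/\$name.log"
    args+=" --pid-file=$dir/\$name.pid"
    args+=" --osd-max-object-name-len=460"
    args+=" --osd-max-object-namespace-len=64"
    args+=" --enable-experimental-unrecoverable-data-corrupting-features=*"
    args+=" --osd-mclock-profile=high_recovery_ops"
    echo "$args"
}

build_osd_args td/test-erasure-eio 6
```

The mkfs invocation additionally appends `--mkfs --key <OSD_SECRET> --osd-uuid <uuid>`, which is why it appears twice per OSD in the trace.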
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 2026-03-08T22:51:57.620 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:51:57.620 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:51:57.620 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:51:57.620 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:57.620 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:51:57.620 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:51:57.620 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T22:51:57.620 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T22:51:57.673 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:51:57.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T22:51:57.681 
INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:51:12.817+0000 7fd3da519780 0 load: jerasure load: lrc 2026-03-08T22:51:57.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:486: TEST_ec_recovery_multiple_objects_eio: CEPH_ARGS='--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:51:57.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:488: TEST_ec_recovery_multiple_objects_eio: local poolname=pool-jerasure 2026-03-08T22:51:57.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:489: TEST_ec_recovery_multiple_objects_eio: create_erasure_coded_pool pool-jerasure 3 2 2026-03-08T22:51:57.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:51:57.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T22:51:57.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=3 2026-03-08T22:51:57.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T22:51:57.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=2 2026-03-08T22:51:57.681 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T22:51:57.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=3 m=2 crush-failure-domain=osd 2026-03-08T22:51:57.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T22:51:57.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T22:51:58.359 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T22:51:58.370 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:51:59.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T22:51:59.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:51:59.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:51:59.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:51:59.372 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: 
wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:51:59.372 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:51:59.372 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:51:59.372 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:51:59.372 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:51:59.372 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:51:59.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:51:59.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:51:59.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:51:59.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:51:59.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:51:59.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 
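The `get_timeout_delays 90 .1` call traced above expands to the delay schedule `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`. A runnable sketch of that logic, reconstructed from the printed array rather than copied from ceph-helpers.sh (the awk arithmetic and the default 15-second cap are assumptions inferred from the trace):

```shell
# Hypothetical reimplementation of get_timeout_delays, inferred from the
# xtrace: emit delays that double each step, cap at max_step, and trim the
# last delay so the total equals the requested timeout.
get_timeout_delays() {
    local timeout=$1 step=${2:-1} max_step=${3:-15}
    awk -v t="$timeout" -v s="$step" -v m="$max_step" 'BEGIN {
        total = 0
        i = s
        while (total < t) {
            if (total + i >= t) {      # final partial step to hit t exactly
                printf "%g", t - total
                break
            }
            printf "%g ", i
            total += i
            i *= 2
            if (i > m) i = m           # cap the step size
        }
        print ""
    }'
}
```

With these arguments the delays sum to exactly the 90-second timeout, matching the array `wait_for_clean` receives in the log.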
2026-03-08T22:51:59.733 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:51:59.733 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:51:59.733 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:51:59.733 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:51:59.733 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:51:59.733 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:51:59.733 INFO:tasks.workunit.client.0.vm04.stderr:6' 2026-03-08T22:51:59.733 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:51:59.733 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:59.734 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:51:59.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803787 2026-03-08T22:51:59.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803787 2026-03-08T22:51:59.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787' 2026-03-08T22:51:59.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:59.807 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:51:59.890 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574858 2026-03-08T22:51:59.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574858 2026-03-08T22:51:59.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858' 2026-03-08T22:51:59.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:59.890 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:51:59.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378633 2026-03-08T22:51:59.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378633 2026-03-08T22:51:59.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-81604378633' 2026-03-08T22:51:59.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:59.976 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:52:00.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116999 2026-03-08T22:52:00.053 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116999 2026-03-08T22:52:00.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-81604378633 3-115964116999' 2026-03-08T22:52:00.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:00.053 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:52:00.122 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=154618822662 2026-03-08T22:52:00.122 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 154618822662 2026-03-08T22:52:00.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-81604378633 3-115964116999 4-154618822662' 2026-03-08T22:52:00.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:00.123 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:52:00.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=197568495620 2026-03-08T22:52:00.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
197568495620 2026-03-08T22:52:00.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-81604378633 3-115964116999 4-154618822662 5-197568495620' 2026-03-08T22:52:00.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:00.202 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:52:00.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=227633266691 2026-03-08T22:52:00.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 227633266691 2026-03-08T22:52:00.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-81604378633 3-115964116999 4-154618822662 5-197568495620 6-227633266691' 2026-03-08T22:52:00.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:00.275 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803787 2026-03-08T22:52:00.275 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:00.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:52:00.277 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803787 2026-03-08T22:52:00.277 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:00.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803787 2026-03-08T22:52:00.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803787' 2026-03-08T22:52:00.278 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803787 2026-03-08T22:52:00.278 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:52:00.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803785 -lt 25769803787 2026-03-08T22:52:00.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:52:01.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:52:01.507 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:52:01.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803788 -lt 25769803787 2026-03-08T22:52:01.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T22:52:01.734 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574858 2026-03-08T22:52:01.734 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:01.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:52:01.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574858 2026-03-08T22:52:01.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:01.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574858 2026-03-08T22:52:01.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574858' 2026-03-08T22:52:01.737 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574858 2026-03-08T22:52:01.737 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:52:02.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574858 -lt 55834574858 2026-03-08T22:52:02.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:02.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-81604378633 2026-03-08T22:52:02.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:02.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:52:02.014 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378633 2026-03-08T22:52:02.014 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:02.015 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378633 2026-03-08T22:52:02.015 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378633' 2026-03-08T22:52:02.015 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378633 2026-03-08T22:52:02.015 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:52:02.304 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378633 -lt 81604378633 2026-03-08T22:52:02.304 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:02.304 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116999 2026-03-08T22:52:02.304 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:02.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:52:02.306 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116999 2026-03-08T22:52:02.306 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:02.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116999 2026-03-08T22:52:02.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116999' 2026-03-08T22:52:02.307 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116999 2026-03-08T22:52:02.308 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:52:02.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117000 -lt 115964116999 2026-03-08T22:52:02.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:02.530 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-154618822662 2026-03-08T22:52:02.531 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
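The repeating pattern in this stretch of the trace is the `flush_pg_stats` helper: tell every OSD to flush its PG stats, remember the sequence number each flush returns, then poll `ceph osd last-stat-seq` until the monitor has caught up. A simplified runnable sketch reconstructed from the xtrace; the `ceph` function here is a hypothetical stub standing in for the real CLI so the example is self-contained, and the timeout handling is an assumption (the trace only shows the `300 -eq 0` check):

```shell
# Hypothetical stub for the ceph CLI (not real ceph behavior): two OSDs
# whose reported last-stat-seq has already passed the flushed sequence.
ceph() {
    case "$*" in
        "osd ls")                printf '0\n1\n' ;;
        "tell osd."*)            echo 41 ;;   # seq returned by flush_pg_stats
        "osd last-stat-seq "*)   echo 42 ;;   # monitor already past that seq
    esac
}

# Sketch of flush_pg_stats as inferred from the trace.
flush_pg_stats() {
    local timeout=${1:-300} ids seqs osd seq s
    ids=$(ceph osd ls)
    seqs=''
    for osd in $ids; do
        seq=$(ceph tell osd.$osd flush_pg_stats)
        test -z "$seq" && continue
        seqs="$seqs $osd-$seq"
    done
    for s in $seqs; do
        osd=$(echo $s | cut -d - -f 1)
        seq=$(echo $s | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        # Poll until the monitor's last-stat-seq catches up with the flush.
        while test "$(ceph osd last-stat-seq $osd)" -lt "$seq"; do
            sleep 1
            test "$timeout" -eq 0 && return 1   # assumed timeout handling
            timeout=$((timeout - 1))
        done
    done
}
```

In the log above the poll usually succeeds on the first or second attempt, e.g. `test 25769803788 -lt 25769803787` failing (seq caught up) ends the wait for osd.0.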
2026-03-08T22:52:02.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:52:02.532 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-154618822662 2026-03-08T22:52:02.532 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:02.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=154618822662 2026-03-08T22:52:02.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 154618822662' 2026-03-08T22:52:02.533 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 154618822662 2026-03-08T22:52:02.534 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:52:02.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 154618822662 -lt 154618822662 2026-03-08T22:52:02.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:02.754 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-197568495620 2026-03-08T22:52:02.754 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:02.755 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
osd=5 2026-03-08T22:52:02.756 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-197568495620 2026-03-08T22:52:02.756 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:02.757 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=197568495620 2026-03-08T22:52:02.757 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 197568495620' 2026-03-08T22:52:02.757 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 197568495620 2026-03-08T22:52:02.757 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:52:02.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 197568495621 -lt 197568495620 2026-03-08T22:52:02.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:02.975 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-227633266691 2026-03-08T22:52:02.975 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:02.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:52:02.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: echo 6-227633266691 2026-03-08T22:52:02.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:02.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=227633266691 2026-03-08T22:52:02.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 227633266691' 2026-03-08T22:52:02.978 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 227633266691 2026-03-08T22:52:02.978 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:52:03.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 227633266692 -lt 227633266691 2026-03-08T22:52:03.194 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:52:03.194 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:03.194 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:03.489 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:52:03.489 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:52:03.489 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:52:03.489 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:52:03.489 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:52:03.489 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:52:03.489 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:52:03.489 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:52:03.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:52:03.709 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:52:03.709 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:03.709 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:03.995 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:491: TEST_ec_recovery_multiple_objects_eio: rados_put td/test-erasure-eio pool-jerasure test1 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=test1 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:03.996 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:52:03.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put test1 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:492: TEST_ec_recovery_multiple_objects_eio: rados_put td/test-erasure-eio pool-jerasure test2 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=test2 2026-03-08T22:52:04.024 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:52:04.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put test2 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:493: TEST_ec_recovery_multiple_objects_eio: rados_put td/test-erasure-eio pool-jerasure test3 
2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=test3 2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:04.056 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:52:04.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put test3 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:496: TEST_ec_recovery_multiple_objects_eio: inject_eio ec data pool-jerasure myobject td/test-erasure-eio 0 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=myobject 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T22:52:04.083 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=0 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure myobject 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=myobject 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure myobject 2026-03-08T22:52:04.083 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:52:04.302 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:52:04.302 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:52:04.302 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:52:04.302 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:52:04.302 INFO:tasks.workunit.client.0.vm04.stderr:6' 2026-03-08T22:52:04.302 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 5 1 0 6 2026-03-08T22:52:04.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '5' '1' '0' '6') 2026-03-08T22:52:04.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T22:52:04.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=3 2026-03-08T22:52:04.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']' 2026-03-08T22:52:04.302 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/3/type 2026-03-08T22:52:04.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T22:52:04.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 3 bluestore_debug_inject_read_err true 2026-03-08T22:52:04.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:52:04.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=3 2026-03-08T22:52:04.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T22:52:04.303 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:52:04.304 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:52:04.304 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.3 2026-03-08T22:52:04.304 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3 2026-03-08T22:52:04.304 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']' 2026-03-08T22:52:04.304 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:04.304 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:04.304 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:04.305 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok 2026-03-08T22:52:04.305 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.3.asok config set bluestore_debug_inject_read_err true 2026-03-08T22:52:04.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 
2026-03-08T22:52:04.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T22:52:04.360 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T22:52:04.360 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.3 2026-03-08T22:52:04.360 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3 2026-03-08T22:52:04.360 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']' 2026-03-08T22:52:04.360 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:04.360 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:04.360 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:04.361 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok 2026-03-08T22:52:04.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T22:52:04.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.3.asok injectdataerr pool-jerasure myobject 0 2026-03-08T22:52:04.417 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:497: TEST_ec_recovery_multiple_objects_eio: ceph osd out 0 2026-03-08T22:52:04.662 INFO:tasks.workunit.client.0.vm04.stderr:osd.0 is already out. 2026-03-08T22:52:04.674 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:500: TEST_ec_recovery_multiple_objects_eio: wait_for_clean 2026-03-08T22:52:04.674 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:52:04.674 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:52:04.674 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:52:04.674 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:52:04.674 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:52:04.674 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:52:04.675 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:52:04.675 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:52:04.675 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:52:04.756 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:52:04.756 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:52:04.756 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:52:04.756 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:52:04.756 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:52:04.756 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:52:04.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:52:04.981 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:52:04.981 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:52:04.981 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:52:04.981 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:52:04.981 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:52:04.981 INFO:tasks.workunit.client.0.vm04.stderr:6' 2026-03-08T22:52:04.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:52:04.981 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:04.982 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:52:05.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803791 2026-03-08T22:52:05.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803791 2026-03-08T22:52:05.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791' 2026-03-08T22:52:05.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:05.057 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:52:05.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574861 2026-03-08T22:52:05.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574861 2026-03-08T22:52:05.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-55834574861' 2026-03-08T22:52:05.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:05.132 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:52:05.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378636 2026-03-08T22:52:05.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378636 2026-03-08T22:52:05.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-55834574861 2-81604378636' 2026-03-08T22:52:05.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:05.207 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:52:05.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117003 2026-03-08T22:52:05.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117003 2026-03-08T22:52:05.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-55834574861 2-81604378636 3-115964117003' 2026-03-08T22:52:05.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:05.279 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 
2026-03-08T22:52:05.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=154618822665 2026-03-08T22:52:05.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 154618822665 2026-03-08T22:52:05.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-55834574861 2-81604378636 3-115964117003 4-154618822665' 2026-03-08T22:52:05.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:05.348 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:52:05.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=197568495624 2026-03-08T22:52:05.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 197568495624 2026-03-08T22:52:05.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-55834574861 2-81604378636 3-115964117003 4-154618822665 5-197568495624' 2026-03-08T22:52:05.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:05.418 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:52:05.489 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=227633266694 2026-03-08T22:52:05.489 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 227633266694 2026-03-08T22:52:05.489 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-55834574861 2-81604378636 3-115964117003 4-154618822665 5-197568495624 6-227633266694' 2026-03-08T22:52:05.489 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:05.489 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803791 2026-03-08T22:52:05.489 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:05.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:52:05.491 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803791 2026-03-08T22:52:05.491 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:05.491 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803791 2026-03-08T22:52:05.491 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803791 2026-03-08T22:52:05.491 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803791' 2026-03-08T22:52:05.492 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:52:05.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803788 -lt 25769803791 2026-03-08T22:52:05.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:52:06.700 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:52:06.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:52:06.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803791 -lt 25769803791 2026-03-08T22:52:06.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:06.911 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574861 2026-03-08T22:52:06.911 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:06.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:52:06.913 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574861 2026-03-08T22:52:06.913 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:06.913 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574861 2026-03-08T22:52:06.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574861' 2026-03-08T22:52:06.914 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574861 2026-03-08T22:52:06.914 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:52:07.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574862 -lt 55834574861 2026-03-08T22:52:07.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:07.139 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378636 2026-03-08T22:52:07.139 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:07.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:52:07.141 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378636 
2026-03-08T22:52:07.141 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:07.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378636 2026-03-08T22:52:07.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378636' 2026-03-08T22:52:07.142 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378636 2026-03-08T22:52:07.142 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:52:07.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378636 -lt 81604378636 2026-03-08T22:52:07.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:07.359 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117003 2026-03-08T22:52:07.360 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:07.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:52:07.361 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117003 2026-03-08T22:52:07.361 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut 
-d - -f 2 2026-03-08T22:52:07.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117003 2026-03-08T22:52:07.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117003' 2026-03-08T22:52:07.362 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964117003 2026-03-08T22:52:07.363 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:52:07.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117003 -lt 115964117003 2026-03-08T22:52:07.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:07.579 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-154618822665 2026-03-08T22:52:07.579 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:07.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:52:07.580 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-154618822665 2026-03-08T22:52:07.580 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:07.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=154618822665 2026-03-08T22:52:07.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 154618822665' 2026-03-08T22:52:07.582 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 154618822665 2026-03-08T22:52:07.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:52:07.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 154618822665 -lt 154618822665 2026-03-08T22:52:07.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:07.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-197568495624 2026-03-08T22:52:07.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:07.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:52:07.796 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-197568495624 2026-03-08T22:52:07.796 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:07.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=197568495624 2026-03-08T22:52:07.797 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 197568495624' 2026-03-08T22:52:07.797 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 197568495624 2026-03-08T22:52:07.797 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:52:08.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 197568495624 -lt 197568495624 2026-03-08T22:52:08.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:08.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-227633266694 2026-03-08T22:52:08.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:08.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T22:52:08.012 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-227633266694 2026-03-08T22:52:08.012 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:08.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=227633266694 2026-03-08T22:52:08.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 
227633266694' 2026-03-08T22:52:08.013 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 227633266694 2026-03-08T22:52:08.013 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T22:52:08.224 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 227633266694 -lt 227633266694 2026-03-08T22:52:08.225 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:52:08.225 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:08.225 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:08.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:52:08.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:52:08.512 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:52:08.512 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:52:08.512 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:52:08.512 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:52:08.512 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:52:08.512 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:52:08.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:52:08.726 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:52:08.727 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:08.727 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:09.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:52:09.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:52:09.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:52:09.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:502: 
TEST_ec_recovery_multiple_objects_eio: rados_get td/test-erasure-eio pool-jerasure test1 2026-03-08T22:52:09.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:52:09.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:52:09.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=test1 2026-03-08T22:52:09.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:52:09.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:52:09.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get test1 td/test-erasure-eio/COPY 2026-03-08T22:52:09.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:52:09.046 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:52:09.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:503: TEST_ec_recovery_multiple_objects_eio: rados_get td/test-erasure-eio pool-jerasure test2 2026-03-08T22:52:09.047 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:52:09.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:52:09.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=test2 2026-03-08T22:52:09.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:52:09.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:52:09.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get test2 td/test-erasure-eio/COPY 2026-03-08T22:52:09.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:52:09.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:52:09.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:504: TEST_ec_recovery_multiple_objects_eio: rados_get td/test-erasure-eio pool-jerasure test3 2026-03-08T22:52:09.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 
2026-03-08T22:52:09.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:52:09.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=test3 2026-03-08T22:52:09.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:52:09.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:52:09.075 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get test3 td/test-erasure-eio/COPY 2026-03-08T22:52:09.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:52:09.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:52:09.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:506: TEST_ec_recovery_multiple_objects_eio: delete_erasure_coded_pool pool-jerasure 2026-03-08T22:52:09.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:52:09.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: 
delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it 2026-03-08T22:52:09.362 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T22:52:09.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile 2026-03-08T22:52:09.644 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T22:52:09.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T22:52:09.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:52:09.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:52:09.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:52:09.654 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:52:09.654 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:52:09.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:52:09.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:52:09.654 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:52:09.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:52:09.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:52:09.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:52:09.792 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:52:09.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:52:09.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:52:09.793 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:52:09.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:52:09.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:52:09.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:52:09.794 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: 
dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:52:09.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:52:09.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:52:09.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:52:09.821 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:52:09.821 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:09.821 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:09.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T22:52:09.823 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:52:09.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:52:09.825 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:52:09.826 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 
2026-03-08T22:52:09.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:52:09.827 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:52:09.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:52:09.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:52:09.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:52:09.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:52:09.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:52:09.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:52:09.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:52:09.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:52:09.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:52:09.830 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:52:09.831 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:52:09.831 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:09.831 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:09.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:52:09.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:52:09.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:52:09.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T22:52:09.833 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:52:09.834 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:09.834 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:09.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p 
/tmp/ceph-asok.102080 2026-03-08T22:52:09.835 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:52:09.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:52:09.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:52:09.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T22:52:09.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T22:52:09.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T22:52:09.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:52:09.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:52:09.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:52:09.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T22:52:09.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T22:52:09.862 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:52:09.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:52:09.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:09.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:09.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:09.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:09.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:52:09.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 
2026-03-08T22:52:09.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:52:09.896 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:52:09.896 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:52:09.896 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:52:09.896 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:52:09.896 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:52:09.898 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:52:09.898 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:52:09.898 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:52:09.898 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:09.898 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:09.898 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo 
/tmp/ceph-asok.102080 2026-03-08T22:52:09.899 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:52:09.899 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:52:09.899 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T22:52:09.951 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:52:09.951 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:52:09.951 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:52:09.951 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:52:09.951 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:52:09.951 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:52:09.951 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:52:09.951 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:52:09.952 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:09.952 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:09.952 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:09.952 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:52:09.952 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:52:09.952 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T22:52:10.004 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T22:52:10.004 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T22:52:10.004 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:52:10.004 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:52:10.004 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:52:10.004 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T22:52:10.004 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:52:10.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:52:10.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:52:10.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:10.117 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:10.117 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:10.117 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:10.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:52:10.118 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:52:10.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 
20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:52:10.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T22:52:10.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:52:10.253 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:52:10.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:52:11.272 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T22:52:11.272 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:52:11.272 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:52:11.272 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:11.272 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:11.272 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:11.272 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:52:11.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T22:52:11.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T22:52:11.318 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:52:11.324 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T22:52:11.325 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:52:09.886+0000 7f154adccd80 0 load: jerasure load: lrc 2026-03-08T22:52:11.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_ec_recovery_unfound td/test-erasure-eio 2026-03-08T22:52:11.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:609: TEST_ec_recovery_unfound: local dir=td/test-erasure-eio 2026-03-08T22:52:11.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:610: TEST_ec_recovery_unfound: local objname=myobject 2026-03-08T22:52:11.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:611: TEST_ec_recovery_unfound: local lastobj=100 2026-03-08T22:52:11.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:613: TEST_ec_recovery_unfound: local testobj=obj75 
2026-03-08T22:52:11.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:615: TEST_ec_recovery_unfound: ORIG_ARGS='--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:52:11.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:616: TEST_ec_recovery_unfound: CEPH_ARGS+=' --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 ' 2026-03-08T22:52:11.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:617: TEST_ec_recovery_unfound: CEPH_ARGS+=' --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10' 2026-03-08T22:52:11.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:618: TEST_ec_recovery_unfound: setup_osds 5 2026-03-08T22:52:11.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=5 2026-03-08T22:52:11.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T22:52:11.326 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 5 - 1 2026-03-08T22:52:11.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 4 2026-03-08T22:52:11.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:52:11.328 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T22:52:11.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:52:11.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:52:11.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:52:11.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:52:11.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T22:52:11.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10' 2026-03-08T22:52:11.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:52:11.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:52:11.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:52:11.329 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T22:52:11.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T22:52:11.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:52:11.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:52:11.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:52:11.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:52:11.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:52:11.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:11.329 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:11.330 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:52:11.330 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:52:11.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T22:52:11.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:52:11.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=56859e55-329d-42ba-84c7-de931c34f3da 2026-03-08T22:52:11.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 56859e55-329d-42ba-84c7-de931c34f3da' 2026-03-08T22:52:11.332 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 56859e55-329d-42ba-84c7-de931c34f3da 2026-03-08T22:52:11.333 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:52:11.350 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCb/a1pXb7qFBAAUdFwE3d91ek9EJPXlis+jQ== 2026-03-08T22:52:11.350 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCb/a1pXb7qFBAAUdFwE3d91ek9EJPXlis+jQ=="}' 2026-03-08T22:52:11.350 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 56859e55-329d-42ba-84c7-de931c34f3da -i td/test-erasure-eio/0/new.json 2026-03-08T22:52:11.485 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:52:11.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T22:52:11.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCb/a1pXb7qFBAAUdFwE3d91ek9EJPXlis+jQ== --osd-uuid 56859e55-329d-42ba-84c7-de931c34f3da 2026-03-08T22:52:11.522 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:11.522+0000 7fabc414d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:11.528 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:11.529+0000 7fabc414d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:11.530 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:11.531+0000 7fabc414d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:11.530 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:11.531+0000 7fabc414d780 -1 bdev(0x5638c4cba800 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:52:11.530 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:11.531+0000 7fabc414d780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T22:52:13.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T22:52:13.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:52:13.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:52:13.670 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:52:13.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:52:13.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:52:13.972 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:52:13.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 
3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:52:13.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:52:13.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:52:13.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:52:13.996 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:13.996+0000 7f2e77293780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:14.001 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:14.002+0000 7f2e77293780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:14.004 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:14.004+0000 7f2e77293780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:14.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:52:14.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:52:14.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:52:14.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:52:14.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:52:14.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:14.219 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:52:14.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:52:14.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:14.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:14.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:15.323 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:15.324+0000 7f2e77293780 -1 Falling back to public interface 2026-03-08T22:52:15.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:52:15.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:15.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:52:15.459 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:52:15.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:15.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:15.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:16.218 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:16.219+0000 7f2e77293780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:52:16.695 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:52:16.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:16.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:16.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:52:16.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:16.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:16.933 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:17.936 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:52:17.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:17.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:17.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:52:17.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:17.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:18.159 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2409941212,v1:127.0.0.1:6803/2409941212] [v2:127.0.0.1:6804/2409941212,v1:127.0.0.1:6805/2409941212] exists,up 56859e55-329d-42ba-84c7-de931c34f3da 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr 
$count - 1)) 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10' 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000' 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:52:18.160 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:18.161 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:52:18.161 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:52:18.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:52:18.162 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:52:18.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=5525990e-a9bb-45e9-823b-19f02c24ea05 2026-03-08T22:52:18.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 5525990e-a9bb-45e9-823b-19f02c24ea05' 2026-03-08T22:52:18.163 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 5525990e-a9bb-45e9-823b-19f02c24ea05 2026-03-08T22:52:18.163 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:52:18.181 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCi/a1pB3TmChAAXQLTYYBsUIjZOcctUsEOqA== 2026-03-08T22:52:18.181 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCi/a1pB3TmChAAXQLTYYBsUIjZOcctUsEOqA=="}' 2026-03-08T22:52:18.181 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 5525990e-a9bb-45e9-823b-19f02c24ea05 -i td/test-erasure-eio/1/new.json 2026-03-08T22:52:18.421 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:52:18.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T22:52:18.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCi/a1pB3TmChAAXQLTYYBsUIjZOcctUsEOqA== --osd-uuid 5525990e-a9bb-45e9-823b-19f02c24ea05 2026-03-08T22:52:18.458 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:18.459+0000 7fd4cc58c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:18.460 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:18.461+0000 7fd4cc58c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:18.461 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:18.462+0000 7fd4cc58c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:18.462 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:18.463+0000 7fd4cc58c780 -1 bdev(0x55dfbd7a3c00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:52:18.462 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:18.463+0000 7fd4cc58c780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T22:52:20.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T22:52:20.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:52:20.590 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T22:52:20.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:52:20.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:52:20.887 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:52:20.887 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T22:52:20.887 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 
3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:52:20.887 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:52:20.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:52:20.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:52:20.916 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:20.917+0000 7fedaea0f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:20.918 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:20.919+0000 7fedaea0f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:20.920 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:20.920+0000 7fedaea0f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:21.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:52:21.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:52:21.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:52:21.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:52:21.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:52:21.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:21.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:52:21.130 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:52:21.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:21.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:52:21.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:22.242 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:22.243+0000 7fedaea0f780 -1 Falling back to public interface 2026-03-08T22:52:22.363 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:52:22.363 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:22.363 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:52:22.363 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:52:22.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:22.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:52:22.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:23.089 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:23.090+0000 7fedaea0f780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:52:23.599 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:52:23.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:23.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:23.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:52:23.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:23.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:52:23.850 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:24.853 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:52:24.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:24.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:24.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:52:24.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:24.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:52:25.088 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/4273139771,v1:127.0.0.1:6811/4273139771] [v2:127.0.0.1:6812/4273139771,v1:127.0.0.1:6813/4273139771] exists,up 5525990e-a9bb-45e9-823b-19f02c24ea05 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr 
$count - 1)) 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10' 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000' 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:52:25.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:25.090 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:25.090 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:25.090 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:25.090 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:52:25.091 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:52:25.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:52:25.092 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:52:25.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=e0ece297-25ce-4956-b4fe-0bcc6ff9f3cb 2026-03-08T22:52:25.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 e0ece297-25ce-4956-b4fe-0bcc6ff9f3cb' 2026-03-08T22:52:25.093 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 e0ece297-25ce-4956-b4fe-0bcc6ff9f3cb 2026-03-08T22:52:25.094 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:52:25.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCp/a1pi8qyBhAApqLq2xHv8Rcx5Z0HBT4aFw== 2026-03-08T22:52:25.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCp/a1pi8qyBhAApqLq2xHv8Rcx5Z0HBT4aFw=="}' 2026-03-08T22:52:25.111 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e0ece297-25ce-4956-b4fe-0bcc6ff9f3cb -i td/test-erasure-eio/2/new.json 2026-03-08T22:52:25.371 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:52:25.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T22:52:25.380 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCp/a1pi8qyBhAApqLq2xHv8Rcx5Z0HBT4aFw== --osd-uuid e0ece297-25ce-4956-b4fe-0bcc6ff9f3cb 2026-03-08T22:52:25.406 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:25.407+0000 7f39c742f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:25.408 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:25.409+0000 7f39c742f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:25.408 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:25.409+0000 7f39c742f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:25.409 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:25.410+0000 7f39c742f780 -1 bdev(0x561e44c93c00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:52:25.409 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:25.410+0000 7f39c742f780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T22:52:27.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T22:52:27.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:52:27.540 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:52:27.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:52:27.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:52:27.846 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:52:27.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:52:27.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 
3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:52:27.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:52:27.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:52:27.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:52:27.875 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:27.874+0000 7f2732022780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:27.876 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:27.877+0000 7f2732022780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:27.877 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:27.878+0000 7f2732022780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:28.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T22:52:28.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:52:28.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T22:52:28.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:52:28.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:52:28.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:52:28.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:52:28.089 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:52:28.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:52:28.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:52:28.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:52:28.689 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:28.690+0000 7f2732022780 -1 Falling back to public interface
2026-03-08T22:52:29.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:52:29.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:52:29.328 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:52:29.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:52:29.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:52:29.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:52:29.553 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:29.554+0000 7f2732022780 -1 osd.2 0 log_to_monitors true
2026-03-08T22:52:29.566 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:52:30.568 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:52:30.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:52:30.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:52:30.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:52:30.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:52:30.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:52:30.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:52:31.049 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:31.050+0000 7f272d196640 -1 osd.2 0 waiting for initial osdmap
2026-03-08T22:52:31.818 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:52:31.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:52:31.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:52:31.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:52:31.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:52:31.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 19 up_thru 19 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/4113715797,v1:127.0.0.1:6819/4113715797] [v2:127.0.0.1:6820/4113715797,v1:127.0.0.1:6821/4113715797] exists,up e0ece297-25ce-4956-b4fe-0bcc6ff9f3cb
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10'
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3'
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal'
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:52:32.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:52:32.042 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:52:32.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:52:32.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:52:32.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:52:32.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:52:32.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:52:32.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:52:32.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:52:32.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3
2026-03-08T22:52:32.044 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:52:32.044 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1b7b756e-df24-46da-a480-94ea5e8f6ed0
2026-03-08T22:52:32.044 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 1b7b756e-df24-46da-a480-94ea5e8f6ed0'
2026-03-08T22:52:32.044 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 1b7b756e-df24-46da-a480-94ea5e8f6ed0
2026-03-08T22:52:32.045 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:52:32.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCw/a1pWeq8AxAApATRibY/8tGgI7WO4We02A==
2026-03-08T22:52:32.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCw/a1pWeq8AxAApATRibY/8tGgI7WO4We02A=="}'
2026-03-08T22:52:32.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1b7b756e-df24-46da-a480-94ea5e8f6ed0 -i td/test-erasure-eio/3/new.json
2026-03-08T22:52:32.306 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:52:32.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json
2026-03-08T22:52:32.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCw/a1pWeq8AxAApATRibY/8tGgI7WO4We02A== --osd-uuid 1b7b756e-df24-46da-a480-94ea5e8f6ed0
2026-03-08T22:52:32.341 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:32.342+0000 7fcf98027780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:32.343 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:32.344+0000 7fcf98027780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:32.344 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:32.345+0000 7fcf98027780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:32.345 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:32.345+0000 7fcf98027780 -1 bdev(0x5571bf545c00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted
2026-03-08T22:52:32.345 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:32.345+0000 7fcf98027780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid
2026-03-08T22:52:34.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring
2026-03-08T22:52:34.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:52:34.463 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository
2026-03-08T22:52:34.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository
2026-03-08T22:52:34.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:52:34.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3
2026-03-08T22:52:34.761 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3
2026-03-08T22:52:34.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:52:34.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:52:34.763 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:52:34.764 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:52:34.786 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:34.785+0000 7f9c2bd23780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:34.790 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:34.791+0000 7f9c2bd23780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:34.792 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:34.792+0000 7f9c2bd23780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:34.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3
2026-03-08T22:52:34.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:52:34.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3
2026-03-08T22:52:34.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:52:34.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:52:34.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:52:34.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:52:34.995 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:52:34.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:52:34.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:52:35.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:52:35.598 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:35.599+0000 7f9c2bd23780 -1 Falling back to public interface
2026-03-08T22:52:36.237 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:52:36.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:52:36.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:52:36.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:52:36.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:52:36.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:52:36.463 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:36.463+0000 7f9c2bd23780 -1 osd.3 0 log_to_monitors true
2026-03-08T22:52:36.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:52:37.488 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:52:37.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:52:37.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:52:37.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:52:37.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:52:37.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:52:37.747 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/2583184876,v1:127.0.0.1:6827/2583184876] [v2:127.0.0.1:6828/2583184876,v1:127.0.0.1:6829/2583184876] exists,up 1b7b756e-df24-46da-a480-94ea5e8f6ed0
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 4
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/4
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10'
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/4'
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/4/journal'
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:52:37.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:52:37.749 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:52:37.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:52:37.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:52:37.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:52:37.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:52:37.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:52:37.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:52:37.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:52:37.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/4
2026-03-08T22:52:37.751 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:52:37.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=69234f0a-7f2a-4136-a531-b24f74c45ffa
2026-03-08T22:52:37.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 69234f0a-7f2a-4136-a531-b24f74c45ffa'
2026-03-08T22:52:37.752 INFO:tasks.workunit.client.0.vm04.stdout:add osd4 69234f0a-7f2a-4136-a531-b24f74c45ffa
2026-03-08T22:52:37.752 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:52:37.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQC1/a1pfvcELhAA3flyuf0jxYGPariEkXX8Cw==
2026-03-08T22:52:37.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQC1/a1pfvcELhAA3flyuf0jxYGPariEkXX8Cw=="}'
2026-03-08T22:52:37.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 69234f0a-7f2a-4136-a531-b24f74c45ffa -i td/test-erasure-eio/4/new.json
2026-03-08T22:52:38.015 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-08T22:52:38.025 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/4/new.json
2026-03-08T22:52:38.026 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQC1/a1pfvcELhAA3flyuf0jxYGPariEkXX8Cw== --osd-uuid 69234f0a-7f2a-4136-a531-b24f74c45ffa
2026-03-08T22:52:38.054 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:38.055+0000 7f7a93238780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:38.056 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:38.057+0000 7f7a93238780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:38.057 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:38.058+0000 7f7a93238780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:38.058 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:38.058+0000 7f7a93238780 -1 bdev(0x556761201c00 td/test-erasure-eio/4/block) open stat got: (1) Operation not permitted
2026-03-08T22:52:38.058 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:38.058+0000 7f7a93238780 -1 bluestore(td/test-erasure-eio/4) _read_fsid unparsable uuid
2026-03-08T22:52:40.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/4/keyring
2026-03-08T22:52:40.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:52:40.290 INFO:tasks.workunit.client.0.vm04.stdout:adding osd4 key to auth repository
2026-03-08T22:52:40.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository
2026-03-08T22:52:40.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:52:40.498 INFO:tasks.workunit.client.0.vm04.stdout:start osd.4
2026-03-08T22:52:40.498 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4
2026-03-08T22:52:40.498 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-recovery-max-single-start 3 --osd-recovery-max-active 3 --osd_min_pg_log_entries=5 --osd_max_pg_log_entries=10 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:52:40.498 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:52:40.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:52:40.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:52:40.530 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:40.529+0000 7fdb5a213780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:40.533 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:40.534+0000 7fdb5a213780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:40.536 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:40.536+0000 7fdb5a213780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:52:40.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4 2026-03-08T22:52:40.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:52:40.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4 2026-03-08T22:52:40.736 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:52:40.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:52:40.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:52:40.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:40.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:52:40.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:40.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:52:40.966 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:41.348 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:41.349+0000 7fdb5a213780 -1 Falling back to public interface 2026-03-08T22:52:41.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:52:41.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:41.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:52:41.967 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:52:41.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:41.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:52:42.195 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:42.219 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:42.220+0000 7fdb5a213780 -1 osd.4 0 log_to_monitors true 2026-03-08T22:52:43.200 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:52:43.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:43.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:43.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:52:43.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:43.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:52:43.296 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:43.297+0000 
7fdb55294640 -1 osd.4 0 waiting for initial osdmap 2026-03-08T22:52:43.470 INFO:tasks.workunit.client.0.vm04.stdout:osd.4 up in weight 1 up_from 36 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/2195051951,v1:127.0.0.1:6835/2195051951] [v2:127.0.0.1:6836/2195051951,v1:127.0.0.1:6837/2195051951] exists,up 69234f0a-7f2a-4136-a531-b24f74c45ffa 2026-03-08T22:52:43.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:52:43.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:52:43.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:52:43.474 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 2026-03-08T22:52:43.474 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:52:43.474 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:52:43.474 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:43.474 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:43.474 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:43.475 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:52:43.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T22:52:43.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T22:52:43.529 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:52:43.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T22:52:43.539 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:52:15.328+0000 7f2e77293780 0 load: jerasure load: lrc 2026-03-08T22:52:43.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:619: TEST_ec_recovery_unfound: CEPH_ARGS='--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:52:43.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:621: TEST_ec_recovery_unfound: local poolname=pool-jerasure 2026-03-08T22:52:43.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:622: TEST_ec_recovery_unfound: create_erasure_coded_pool pool-jerasure 3 2 2026-03-08T22:52:43.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 
2026-03-08T22:52:43.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T22:52:43.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=3 2026-03-08T22:52:43.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T22:52:43.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=2 2026-03-08T22:52:43.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T22:52:43.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=3 m=2 crush-failure-domain=osd 2026-03-08T22:52:43.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T22:52:43.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T22:52:44.276 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T22:52:44.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:52:45.288 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T22:52:45.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:52:45.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:52:45.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:52:45.289 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:52:45.289 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:52:45.289 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:52:45.289 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:52:45.289 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:52:45.289 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:52:45.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 
2026-03-08T22:52:45.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:52:45.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:52:45.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:52:45.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:52:45.368 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:52:45.616 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:52:45.616 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:52:45.616 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:52:45.616 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:52:45.616 INFO:tasks.workunit.client.0.vm04.stderr:4' 2026-03-08T22:52:45.616 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:52:45.616 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:45.617 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:52:45.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803784 2026-03-08T22:52:45.699 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803784 2026-03-08T22:52:45.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803784' 2026-03-08T22:52:45.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:45.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:52:45.775 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574855 2026-03-08T22:52:45.775 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574855 2026-03-08T22:52:45.775 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803784 1-55834574855' 2026-03-08T22:52:45.775 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:45.775 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:52:45.858 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378629 2026-03-08T22:52:45.858 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378629 2026-03-08T22:52:45.858 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803784 1-55834574855 2-81604378629' 2026-03-08T22:52:45.858 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:45.858 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:52:45.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116996 2026-03-08T22:52:45.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116996 2026-03-08T22:52:45.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803784 1-55834574855 2-81604378629 3-115964116996' 2026-03-08T22:52:45.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:45.935 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T22:52:46.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=154618822659 2026-03-08T22:52:46.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 154618822659 2026-03-08T22:52:46.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803784 1-55834574855 
2-81604378629 3-115964116996 4-154618822659' 2026-03-08T22:52:46.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:46.007 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803784 2026-03-08T22:52:46.007 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:46.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:52:46.009 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803784 2026-03-08T22:52:46.009 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:46.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803784 2026-03-08T22:52:46.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803784' 2026-03-08T22:52:46.010 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803784 2026-03-08T22:52:46.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:52:46.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803782 -lt 25769803784 2026-03-08T22:52:46.226 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:52:47.227 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:52:47.227 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:52:47.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803784 -lt 25769803784 2026-03-08T22:52:47.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:47.463 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574855 2026-03-08T22:52:47.463 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:47.464 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:52:47.464 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574855 2026-03-08T22:52:47.465 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:47.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574855 2026-03-08T22:52:47.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: 
flush_pg_stats: echo 'waiting osd.1 seq 55834574855' 2026-03-08T22:52:47.466 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574855 2026-03-08T22:52:47.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:52:47.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574855 -lt 55834574855 2026-03-08T22:52:47.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:47.692 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378629 2026-03-08T22:52:47.692 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:47.693 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:52:47.694 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378629 2026-03-08T22:52:47.694 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:47.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378629 2026-03-08T22:52:47.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378629' 2026-03-08T22:52:47.695 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378629 
2026-03-08T22:52:47.695 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:52:47.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378630 -lt 81604378629 2026-03-08T22:52:47.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:47.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116996 2026-03-08T22:52:47.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:47.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:52:47.933 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116996 2026-03-08T22:52:47.933 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:47.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116996 2026-03-08T22:52:47.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116996' 2026-03-08T22:52:47.934 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116996 2026-03-08T22:52:47.934 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 3 2026-03-08T22:52:48.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116996 -lt 115964116996 2026-03-08T22:52:48.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:48.178 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-154618822659 2026-03-08T22:52:48.178 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:48.179 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:52:48.180 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-154618822659 2026-03-08T22:52:48.180 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:48.181 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=154618822659 2026-03-08T22:52:48.181 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 154618822659' 2026-03-08T22:52:48.181 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 154618822659 2026-03-08T22:52:48.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:52:48.416 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 154618822659 -lt 154618822659 2026-03-08T22:52:48.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:52:48.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:48.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:48.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:52:48.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:52:48.710 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:52:48.710 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:52:48.710 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:52:48.710 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:52:48.710 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 
2026-03-08T22:52:48.710 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:52:48.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:52:48.931 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:52:48.931 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:48.931 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:49.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:52:49.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:52:49.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:52:49.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:624: TEST_ec_recovery_unfound: ceph pg dump pgs 2026-03-08T22:52:49.453 INFO:tasks.workunit.client.0.vm04.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION 
SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:52:49.453 INFO:tasks.workunit.client.0.vm04.stdout:2.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:44.325923+0000 0'0 42:15 [3,1,4,0,2] 3 [3,1,4,0,2] 3 0'0 2026-03-08T22:52:44.210362+0000 0'0 2026-03-08T22:52:44.210362+0000 0 0 periodic scrub scheduled @ 2026-03-09T23:50:06.387830+0000 0 0 2026-03-08T22:52:49.453 INFO:tasks.workunit.client.0.vm04.stdout:1.3 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:45.007460+0000 0'0 42:11 [4,1,2] 4 [4,1,2] 4 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-10T07:24:53.707853+0000 0 0 2026-03-08T22:52:49.453 INFO:tasks.workunit.client.0.vm04.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:45.006834+0000 0'0 42:94 [0,4,1] 0 [0,4,1] 0 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-10T06:52:19.099364+0000 0 0 2026-03-08T22:52:49.453 INFO:tasks.workunit.client.0.vm04.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:44.422580+0000 0'0 42:11 [4,3,0] 4 [4,3,0] 4 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-10T04:42:38.302799+0000 0 0 2026-03-08T22:52:49.453 INFO:tasks.workunit.client.0.vm04.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:31.225832+0000 0'0 42:66 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:49:41.708779+0000 0 0 2026-03-08T22:52:49.453 INFO:tasks.workunit.client.0.vm04.stdout: 2026-03-08T22:52:49.453 INFO:tasks.workunit.client.0.vm04.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T22:52:49.453 INFO:tasks.workunit.client.0.vm04.stderr:dumped pgs 2026-03-08T22:52:49.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:626: TEST_ec_recovery_unfound: rados_put td/test-erasure-eio pool-jerasure myobject 2026-03-08T22:52:49.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:52:49.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:52:49.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=myobject 2026-03-08T22:52:49.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:49.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:52:49.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:49.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:52:49.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:49.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: 
rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:52:49.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:52:49.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:52:49.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put myobject td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:49.683 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:628: TEST_ec_recovery_unfound: get_osds pool-jerasure myobject 2026-03-08T22:52:49.683 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:52:49.683 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=myobject 2026-03-08T22:52:49.684 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure myobject 2026-03-08T22:52:49.684 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:52:49.960 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:52:49.960 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:52:49.960 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:52:49.960 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:52:49.960 
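The rados_put trace above builds its test payload with `printf '%*s'`, which left-pads each marker with spaces to a fixed width of 1024. Concatenating the four markers yields the 4 KiB ORIGINAL file; a standalone sketch:

```shell
#!/usr/bin/env bash
# Sketch of how rados_put assembles its 4 KiB payload: each marker is
# right-justified (space-padded on the left) to 1024 characters.
out=$(mktemp)
for marker in AAA BBB CCCC DDDD; do
    printf '%*s' 1024 "$marker"    # width 1024, so 4 x 1024 bytes total
done > "$out"
wc -c < "$out"    # -> 4096
rm -f "$out"
```

The 4096-byte size matches the `dd ... bs=1024 count=4` rewrite of ORIGINAL seen later in the log.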
INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:52:49.960 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 4 0 2 2026-03-08T22:52:49.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:628: TEST_ec_recovery_unfound: initial_osds=('3' '1' '4' '0' '2') 2026-03-08T22:52:49.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:628: TEST_ec_recovery_unfound: local -a initial_osds 2026-03-08T22:52:49.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:629: TEST_ec_recovery_unfound: local last_osd=2 2026-03-08T22:52:49.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:630: TEST_ec_recovery_unfound: kill_daemons td/test-erasure-eio TERM osd.2 2026-03-08T22:52:49.960 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:52:49.960 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:52:49.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:52:49.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:52:49.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:52:50.274 
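The get_osds trace above captures jq's newline-separated acting set into a local variable and then flattens it with an intentionally unquoted `echo`, relying on shell word splitting. A standalone sketch, with the acting set from the log:

```shell
#!/usr/bin/env bash
# Sketch of get_osds flattening jq's '.acting | .[]' output: the
# variable holds one osd id per line, and the unquoted expansion
# word-splits the newlines into a single space-separated line.
osds='3
1
4
0
2'
echo $osds    # intentionally unquoted -> 3 1 4 0 2
```

The caller then assigns this line straight into a bash array, which is the `initial_osds=('3' '1' '4' '0' '2')` step in the trace.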
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:52:50.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:631: TEST_ec_recovery_unfound: ceph osd down 2 2026-03-08T22:52:50.505 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 is already down. 2026-03-08T22:52:50.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:632: TEST_ec_recovery_unfound: ceph osd out 2 2026-03-08T22:52:50.776 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 is already out. 2026-03-08T22:52:50.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:634: TEST_ec_recovery_unfound: ceph pg dump pgs 2026-03-08T22:52:51.004 INFO:tasks.workunit.client.0.vm04.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:52:51.004 INFO:tasks.workunit.client.0.vm04.stdout:2.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:44.325923+0000 0'0 42:15 [3,1,4,0,2] 3 [3,1,4,0,2] 3 0'0 2026-03-08T22:52:44.210362+0000 0'0 2026-03-08T22:52:44.210362+0000 0 0 periodic scrub scheduled @ 2026-03-09T23:50:06.387830+0000 0 0 2026-03-08T22:52:51.004 INFO:tasks.workunit.client.0.vm04.stdout:1.3 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:45.007460+0000 0'0 42:11 [4,1,2] 4 [4,1,2] 4 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-10T07:24:53.707853+0000 0 0 2026-03-08T22:52:51.004 INFO:tasks.workunit.client.0.vm04.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 
active+clean 2026-03-08T22:52:45.006834+0000 0'0 42:94 [0,4,1] 0 [0,4,1] 0 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-10T06:52:19.099364+0000 0 0 2026-03-08T22:52:51.004 INFO:tasks.workunit.client.0.vm04.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:44.422580+0000 0'0 42:11 [4,3,0] 4 [4,3,0] 4 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-10T04:42:38.302799+0000 0 0 2026-03-08T22:52:51.004 INFO:tasks.workunit.client.0.vm04.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:31.225832+0000 0'0 42:66 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:49:41.708779+0000 0 0 2026-03-08T22:52:51.004 INFO:tasks.workunit.client.0.vm04.stdout: 2026-03-08T22:52:51.004 INFO:tasks.workunit.client.0.vm04.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T22:52:51.004 INFO:tasks.workunit.client.0.vm04.stderr:dumped pgs 2026-03-08T22:52:51.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:636: TEST_ec_recovery_unfound: dd if=/dev/urandom of=td/test-erasure-eio/ORIGINAL bs=1024 count=4 2026-03-08T22:52:51.017 INFO:tasks.workunit.client.0.vm04.stderr:4+0 records in 2026-03-08T22:52:51.017 INFO:tasks.workunit.client.0.vm04.stderr:4+0 records out 2026-03-08T22:52:51.017 INFO:tasks.workunit.client.0.vm04.stderr:4096 bytes (4.1 kB, 4.0 KiB) copied, 8.9126e-05 s, 46.0 MB/s 2026-03-08T22:52:51.017 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: seq 1 100 2026-03-08T22:52:51.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj1 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj2 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.078 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj3 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj4 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.143 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.143 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj5 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.173 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.173 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj6 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj7 
td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.232 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj8 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj9 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.292 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj10 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj11 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.350 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: 
TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.350 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj12 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj13 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj14 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj15 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.465 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj16 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.498 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj17 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.531 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj18 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.561 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj19 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put 
obj20 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj21 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.650 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.651 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj22 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj23 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.705 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.705 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj24 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.734 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj25 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj26 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj27 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj28 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:52:51.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj29 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj30 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj31 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj32 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.965 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.965 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure put obj33 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:51.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:51.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj34 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.018 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.019 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj35 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj36 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj37 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.100 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj38 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.125 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.125 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj39 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.151 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.151 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj40 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj41 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:52:52.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj42 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj43 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.260 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.260 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj44 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj45 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure put obj46 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj47 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj48 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj49 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj50 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.458 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj51 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.486 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.486 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj52 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj53 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj54 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:52:52.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj55 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj56 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.627 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.627 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj57 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.657 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.657 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj58 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure put obj59 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj60 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.740 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.740 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj61 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.768 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.768 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj62 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj63 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.832 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj64 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.861 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.861 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj65 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.955 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj66 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:52.983 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:52.983 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj67 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:52:53.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj68 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj69 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj70 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.097 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.097 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj71 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure put obj72 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj73 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj74 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj75 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj76 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.289 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj77 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj78 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj79 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj80 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:52:53.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj81 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj82 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.457 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.457 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj83 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj84 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure put obj85 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj86 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj87 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj88 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.628 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.628 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj89 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.657 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.658 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj90 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj91 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj92 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj93 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 
2026-03-08T22:52:53.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj94 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj95 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj96 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj97 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure put obj98 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj99 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.945 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:637: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:52:53.945 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:639: TEST_ec_recovery_unfound: rados --pool pool-jerasure put obj100 td/test-erasure-eio/ORIGINAL 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:642: TEST_ec_recovery_unfound: inject_eio ec data pool-jerasure obj75 td/test-erasure-eio 0 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: 
inject_eio: local poolname=pool-jerasure 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=obj75 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=0 2026-03-08T22:52:53.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T22:52:53.978 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure obj75 2026-03-08T22:52:53.978 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:52:53.978 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj75 2026-03-08T22:52:53.978 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj75 2026-03-08T22:52:53.979 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:52:54.209 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:52:54.210 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:52:54.210 INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:52:54.210 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:52:54.210 INFO:tasks.workunit.client.0.vm04.stderr:2147483647' 2026-03-08T22:52:54.210 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 4 0 2147483647 2026-03-08T22:52:54.210 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '1' '4' '0' '2147483647') 2026-03-08T22:52:54.210 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T22:52:54.210 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=3 2026-03-08T22:52:54.210 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']' 2026-03-08T22:52:54.210 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/3/type 2026-03-08T22:52:54.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T22:52:54.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 3 
bluestore_debug_inject_read_err true 2026-03-08T22:52:54.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:52:54.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=3 2026-03-08T22:52:54.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T22:52:54.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:52:54.212 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:52:54.212 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.3 2026-03-08T22:52:54.212 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3 2026-03-08T22:52:54.212 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']' 2026-03-08T22:52:54.212 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:54.213 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:54.213 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:54.213 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok 2026-03-08T22:52:54.213 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.3.asok config set bluestore_debug_inject_read_err true 2026-03-08T22:52:54.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T22:52:54.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T22:52:54.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T22:52:54.271 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.3 2026-03-08T22:52:54.271 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3 2026-03-08T22:52:54.271 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']' 2026-03-08T22:52:54.271 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:54.271 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:54.271 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:54.271 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok 2026-03-08T22:52:54.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T22:52:54.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.3.asok injectdataerr pool-jerasure obj75 0 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:643: TEST_ec_recovery_unfound: inject_eio ec data pool-jerasure obj75 td/test-erasure-eio 1 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=obj75 2026-03-08T22:52:54.329 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=1 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure obj75 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj75 2026-03-08T22:52:54.329 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj75 2026-03-08T22:52:54.330 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:52:54.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:52:54.553 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:52:54.553 INFO:tasks.workunit.client.0.vm04.stderr:4 
2026-03-08T22:52:54.553 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:52:54.553 INFO:tasks.workunit.client.0.vm04.stderr:2147483647' 2026-03-08T22:52:54.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 4 0 2147483647 2026-03-08T22:52:54.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '1' '4' '0' '2147483647') 2026-03-08T22:52:54.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T22:52:54.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=1 2026-03-08T22:52:54.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']' 2026-03-08T22:52:54.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/1/type 2026-03-08T22:52:54.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T22:52:54.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 1 bluestore_debug_inject_read_err true 2026-03-08T22:52:54.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:52:54.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=1 2026-03-08T22:52:54.555 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T22:52:54.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:52:54.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:52:54.556 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.1 2026-03-08T22:52:54.556 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:52:54.556 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:52:54.556 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:54.556 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:54.556 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:54.556 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.1.asok 2026-03-08T22:52:54.557 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.1.asok config set 
bluestore_debug_inject_read_err true 2026-03-08T22:52:54.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T22:52:54.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T22:52:54.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T22:52:54.614 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.1 2026-03-08T22:52:54.614 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:52:54.614 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:52:54.614 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:54.614 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:54.614 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:54.614 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.1.asok 2026-03-08T22:52:54.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T22:52:54.614 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.1.asok injectdataerr pool-jerasure obj75 1 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:645: TEST_ec_recovery_unfound: activate_osd td/test-erasure-eio 2 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:52:54.669 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:52:54.669 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:54.670 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:54.670 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:54.670 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:52:54.670 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:52:54.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:52:54.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:52:54.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:52:54.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:52:54.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:52:54.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:52:54.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:52:54.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 
2026-03-08T22:52:54.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:52:54.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:52:54.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:52:54.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T22:52:54.671 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:52:54.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:52:54.672 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/2/whoami 2026-03-08T22:52:54.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T22:52:54.673 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:52:54.674 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:52:54.675 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:52:54.690 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:54.690+0000 7fe1c32ba780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:54.691 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:54.692+0000 7fe1c32ba780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:54.694 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:54.694+0000 7fe1c32ba780 -1 WARNING: all dangerous and experimental features are enabled. 
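The `get_asok_dir`/`get_asok_path` calls traced repeatedly above (ceph-helpers.sh lines 108-120) boil down to two small helpers: pick an admin-socket directory (here `/tmp/ceph-asok.102080`, which looks like `/tmp/ceph-asok.$$`), then either build a concrete `ceph-<name>.asok` path or emit the literal `$cluster-$name.asok` template for daemons to expand. A minimal reconstruction from the xtrace — the name of the directory-override variable (`CEPH_ASOK_DIR`) and the `${TMPDIR:-/tmp}` fallback are assumptions, not visible in the trace:

```shell
# Hypothetical reconstruction of the asok helpers seen in the xtrace; the real
# versions live in qa/standalone/ceph-helpers.sh.
get_asok_dir() {
    if [ -n "$CEPH_ASOK_DIR" ]; then
        # assumed override variable; the trace only shows the '-n ""' test fail
        echo "$CEPH_ASOK_DIR"
    else
        # trace shows /tmp/ceph-asok.102080, i.e. a per-process tmp directory
        echo "${TMPDIR:-/tmp}/ceph-asok.$$"
    fi
}

get_asok_path() {
    local name=$1
    if [ -n "$name" ]; then
        # concrete socket for one daemon, e.g. .../ceph-osd.3.asok
        echo "$(get_asok_dir)/ceph-$name.asok"
    else
        # literal template, left for the daemon to expand via --admin-socket
        echo "$(get_asok_dir)"'/$cluster-$name.asok'
    fi
}
```

With a name the helper yields a usable socket path (`set_config` feeds it to `ceph --format json daemon ... config set`); without one it yields the template string that `activate_osd` passes as `--admin-socket` above.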
2026-03-08T22:52:54.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T22:52:54.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:52:54.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:52:54.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:52:54.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:52:54.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:54.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:52:54.891 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:52:54.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:54.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:52:55.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:55.250 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:55.250+0000 7fe1c32ba780 -1 Falling back to public interface 2026-03-08T22:52:56.128 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:52:56.128 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:56.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:56.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:52:56.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:56.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:52:56.360 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:56.379 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:52:56.380+0000 7fe1c32ba780 -1 osd.2 42 log_to_monitors true 2026-03-08T22:52:57.361 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:52:57.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:57.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:57.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:52:57.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:57.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:52:57.595 
INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up out weight 0 up_from 48 up_thru 19 down_at 43 last_clean_interval [19,42) [v2:127.0.0.1:6818/1955793577,v1:127.0.0.1:6819/1955793577] [v2:127.0.0.1:6820/1955793577,v1:127.0.0.1:6821/1955793577] exists,up e0ece297-25ce-4956-b4fe-0bcc6ff9f3cb 2026-03-08T22:52:57.648 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:52:57.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:52:57.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:52:57.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:646: TEST_ec_recovery_unfound: ceph osd in 2 2026-03-08T22:52:58.318 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 is already in. 
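The `wait_for_osd` trace above (ceph-helpers.sh lines 978-991) is a simple poll: up to 300 one-second attempts, each grepping `ceph osd dump` for `osd.<id> <state>`, returning 0 on the first hit. A sketch of that loop, assuming only what the xtrace shows (the real helper also echoes the iteration counter, which is dropped here for brevity):

```shell
# Sketch of the wait_for_osd polling loop from the xtrace: poll `ceph osd dump`
# once per second, up to 300 tries, until "osd.<id> <state>" appears.
wait_for_osd() {
    local state=$1   # e.g. "up"
    local id=$2      # e.g. 2
    local status=1
    for ((i = 0; i < 300; i++)); do
        if ceph osd dump 2>/dev/null | grep -q "osd.$id $state"; then
            status=0   # matched, e.g. the "osd.2 up out weight 0 ..." line above
            break
        fi
        sleep 1
    done
    return $status
}
```

In the run above the loop needed three iterations (`echo 0`, `1`, `2`) before `osd.2 up` appeared in the dump.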
2026-03-08T22:52:58.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:648: TEST_ec_recovery_unfound: sleep 15 2026-03-08T22:53:13.331 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:650: TEST_ec_recovery_unfound: seq 1 100 2026-03-08T22:53:13.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:650: TEST_ec_recovery_unfound: for tmp in $(seq 1 100) 2026-03-08T22:53:13.334 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:651: TEST_ec_recovery_unfound: get_state 2.0 2026-03-08T22:53:13.334 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:60: get_state: local pgid=2.0 2026-03-08T22:53:13.334 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:61: get_state: local sname=state 2026-03-08T22:53:13.335 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:62: get_state: ceph --format json pg dump pgs 2026-03-08T22:53:13.335 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:63: get_state: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .state' 2026-03-08T22:53:13.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:651: TEST_ec_recovery_unfound: state=active+recovery_unfound+degraded 2026-03-08T22:53:13.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:652: TEST_ec_recovery_unfound: echo active+recovery_unfound+degraded 
2026-03-08T22:53:13.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:652: TEST_ec_recovery_unfound: grep recovery_unfound 2026-03-08T22:53:13.559 INFO:tasks.workunit.client.0.vm04.stdout:active+recovery_unfound+degraded 2026-03-08T22:53:13.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:653: TEST_ec_recovery_unfound: '[' 0 = 0 ']' 2026-03-08T22:53:13.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:654: TEST_ec_recovery_unfound: break 2026-03-08T22:53:13.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:660: TEST_ec_recovery_unfound: ceph pg dump pgs 2026-03-08T22:53:13.763 INFO:tasks.workunit.client.0.vm04.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:53:13.763 INFO:tasks.workunit.client.0.vm04.stdout:2.0 101 1 3 0 1 413696 0 0 101 0 101 active+recovery_unfound+degraded 2026-03-08T22:52:59.032895+0000 46'101 51:346 [3,1,4,0,2] 3 [3,1,4,0,2] 3 0'0 2026-03-08T22:52:44.210362+0000 0'0 2026-03-08T22:52:44.210362+0000 0 0 no scrub is scheduled 0 0 2026-03-08T22:53:13.763 INFO:tasks.workunit.client.0.vm04.stdout:1.3 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:58.487346+0000 0'0 51:43 [4,1,2] 4 [4,1,2] 4 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-09T23:37:04.588473+0000 0 0 2026-03-08T22:53:13.763 INFO:tasks.workunit.client.0.vm04.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 
2026-03-08T22:52:45.006834+0000 0'0 51:111 [0,4,1] 0 [0,4,1] 0 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-10T06:52:19.099364+0000 0 0 2026-03-08T22:53:13.763 INFO:tasks.workunit.client.0.vm04.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:44.422580+0000 0'0 51:28 [4,3,0] 4 [4,3,0] 4 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-10T04:42:38.302799+0000 0 0 2026-03-08T22:53:13.763 INFO:tasks.workunit.client.0.vm04.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:52:58.901173+0000 0'0 51:98 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:52:10.235072+0000 0'0 2026-03-08T22:52:10.235072+0000 0 0 periodic scrub scheduled @ 2026-03-10T04:16:53.240227+0000 0 0 2026-03-08T22:53:13.763 INFO:tasks.workunit.client.0.vm04.stdout: 2026-03-08T22:53:13.763 INFO:tasks.workunit.client.0.vm04.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
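The test's `get_state` helper (test-erasure-eio.sh lines 60-64) pulls the PG state out of the JSON form of `pg dump pgs` with `jq`. The same information is visible in the plain-text table above; a small hypothetical helper (not part of the traced scripts) that extracts one column from that table with `awk` — column 6 is `UNFOUND` per the header row:

```shell
# Hypothetical helper: read the UNFOUND column for one PG from the plain-text
# `ceph pg dump pgs` table (column 6, per the PG_STAT ... UNFOUND header).
unfound_for_pg() {
    local pgid=$1
    ceph pg dump pgs 2>/dev/null | awk -v pg="$pgid" '$1 == pg { print $6 }'
}
```

Against the table above, `unfound_for_pg 2.0` would report the single unfound object that `list_unfound` then identifies as `obj75`.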
2026-03-08T22:53:13.764 INFO:tasks.workunit.client.0.vm04.stderr:dumped pgs 2026-03-08T22:53:13.774 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:661: TEST_ec_recovery_unfound: ceph pg 2.0 list_unfound 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout:{ 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "num_missing": 1, 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "num_unfound": 1, 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "objects": [ 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "oid": { 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "oid": "obj75", 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "key": "", 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "snapid": -2, 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "hash": 3565005477, 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "max": 0, 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "pool": 2, 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "namespace": "" 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "need": "46'76", 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "have": "0'0", 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "flags": "none", 2026-03-08T22:53:13.844 INFO:tasks.workunit.client.0.vm04.stdout: "clean_regions": "clean_offsets: [], clean_omap: false, new_object: true", 2026-03-08T22:53:13.845 INFO:tasks.workunit.client.0.vm04.stdout: "locations": [ 2026-03-08T22:53:13.845 INFO:tasks.workunit.client.0.vm04.stdout: "0(3)", 2026-03-08T22:53:13.845 INFO:tasks.workunit.client.0.vm04.stdout: "4(2)" 
2026-03-08T22:53:13.845 INFO:tasks.workunit.client.0.vm04.stdout: ] 2026-03-08T22:53:13.845 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:53:13.845 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.845 INFO:tasks.workunit.client.0.vm04.stdout: "state": "NotRecovering", 2026-03-08T22:53:13.845 INFO:tasks.workunit.client.0.vm04.stdout: "available_might_have_unfound": true, 2026-03-08T22:53:13.845 INFO:tasks.workunit.client.0.vm04.stdout: "might_have_unfound": [], 2026-03-08T22:53:13.845 INFO:tasks.workunit.client.0.vm04.stdout: "more": false 2026-03-08T22:53:13.845 INFO:tasks.workunit.client.0.vm04.stdout:} 2026-03-08T22:53:13.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:662: TEST_ec_recovery_unfound: ceph pg 2.0 query 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout:{ 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "snap_trimq": "[]", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "snap_trimq_len": 0, 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "state": "active+recovery_unfound+degraded", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "epoch": 51, 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "up": [ 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: 2 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "acting": [ 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:53:13.921 
INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: 2 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "acting_recovery_backfill": [ 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "0(3)", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "1(1)", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "2(4)", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "3(0)", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "4(2)" 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "info": { 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "pgid": "2.0s0", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "shared": "0", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "last_update": "46'101", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "last_complete": "46'75", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "log_tail": "0'0", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "last_user_version": 101, 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "last_backfill": "MAX", 2026-03-08T22:53:13.921 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [], 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "history": { 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_created": 40, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_pool_created": 40, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 50, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: 
"last_interval_started": 49, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 41, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_clean": 40, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_split": 0, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_marked_full": 0, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "same_up_since": 49, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "same_interval_since": 49, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "same_primary_since": 40, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "prior_readable_until_ub": 0 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "stats": { 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "version": "46'101", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "reported_seq": 347, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "reported_epoch": 51, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "state": "active+recovery_unfound+degraded", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_fresh": "2026-03-08T22:52:59.032895+0000", 2026-03-08T22:53:13.922 
INFO:tasks.workunit.client.0.vm04.stdout: "last_change": "2026-03-08T22:52:59.032895+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_active": "2026-03-08T22:52:59.032895+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_peered": "2026-03-08T22:52:59.032895+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean": "2026-03-08T22:52:49.682370+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_active": "2026-03-08T22:52:58.483619+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_peered": "2026-03-08T22:52:58.483619+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_unstale": "2026-03-08T22:52:59.032895+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_undegraded": "2026-03-08T22:52:58.465325+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_fullsized": "2026-03-08T22:52:59.032895+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "mapping_epoch": 49, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "log_start": "0'0", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_start": "0'0", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "created": 40, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 41, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "parent": "0.0", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "parent_split_bits": 0, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:53:13.922 
INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "objects_scrubbed": 0, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "log_size": 101, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "log_dups_size": 0, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_size": 101, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "stats_invalid": false, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "dirty_stats_invalid": false, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "omap_stats_invalid": false, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_stats_invalid": false, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_bytes_stats_invalid": false, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "pin_stats_invalid": false, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "manifest_stats_invalid": false, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrimq_len": 0, 2026-03-08T22:53:13.922 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_duration": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_schedule": "no scrub is scheduled", 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_duration": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "objects_trimmed": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrim_duration": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "stat_sum": { 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes": 413696, 2026-03-08T22:53:13.923 
INFO:tasks.workunit.client.0.vm04.stdout: "num_objects": 101, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_clones": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_copies": 505, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing_on_primary": 1, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_degraded": 3, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_misplaced": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_unfound": 1, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_dirty": 101, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_whiteouts": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_read": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_read_kb": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_write": 101, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_write_kb": 404, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_scrub_errors": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_shallow_scrub_errors": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_deep_scrub_errors": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_recovered": 99, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_recovered": 405504, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_keys_recovered": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_omap": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_hit_set_archive": 0, 
2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_hit_set_archive": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_kb": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_kb": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_promote": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_high": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_low": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_some": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_full": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_pinned": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_legacy_snapsets": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_large_omap_objects": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_manifest": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_bytes": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_keys": 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_repaired": 0 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "up": [ 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:53:13.923 
INFO:tasks.workunit.client.0.vm04.stdout: 2 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "acting": [ 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: 2 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.923 INFO:tasks.workunit.client.0.vm04.stdout: "avail_no_missing": [ 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "0(3)", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "4(2)" 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "object_location_counts": [ 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "shards": "0(3),1(1),2(4),3(0),4(2)", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "objects": 100 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "shards": "0(3),4(2)", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "objects": 1 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "blocked_by": [], 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "up_primary": 3, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "acting_primary": 3, 2026-03-08T22:53:13.924 
INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [] 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "empty": 0, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "dne": 0, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "incomplete": 0, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 50, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "hit_set_history": { 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "current_last_update": "0'0", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "history": [] 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "peer_info": [ 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "peer": "0(3)", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "pgid": "2.0s3", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "shared": "3", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_update": "46'101", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_complete": "46'101", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "log_tail": "0'0", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_user_version": 101, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_backfill": "MAX", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [], 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "history": { 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_created": 40, 2026-03-08T22:53:13.924 
INFO:tasks.workunit.client.0.vm04.stdout: "epoch_pool_created": 40, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 50, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_started": 49, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 41, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_clean": 40, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_split": 0, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_marked_full": 0, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "same_up_since": 49, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "same_interval_since": 49, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "same_primary_since": 40, 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.924 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "prior_readable_until_ub": 0 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "stats": { 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "version": "46'101", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "reported_seq": 130, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "reported_epoch": 46, 2026-03-08T22:53:13.925 
INFO:tasks.workunit.client.0.vm04.stdout: "state": "active+undersized+degraded", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_fresh": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_change": "2026-03-08T22:52:50.190723+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_active": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_peered": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean": "2026-03-08T22:52:49.682370+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_active": "2026-03-08T22:52:50.190723+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_peered": "2026-03-08T22:52:50.190723+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_unstale": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_undegraded": "2026-03-08T22:52:50.175452+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_fullsized": "2026-03-08T22:52:50.175360+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "mapping_epoch": 49, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "log_start": "0'0", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_start": "0'0", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "created": 40, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 41, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "parent": "0.0", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "parent_split_bits": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:53:13.925 
INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "objects_scrubbed": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "log_size": 101, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "log_dups_size": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_size": 101, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "stats_invalid": false, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "dirty_stats_invalid": false, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "omap_stats_invalid": false, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_stats_invalid": false, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_bytes_stats_invalid": false, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "pin_stats_invalid": false, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "manifest_stats_invalid": false, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrimq_len": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_duration": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_schedule": "periodic scrub scheduled @ 2026-03-10T10:40:40.258384+0000", 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_duration": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "objects_trimmed": 0, 2026-03-08T22:53:13.925 
INFO:tasks.workunit.client.0.vm04.stdout: "snaptrim_duration": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "stat_sum": { 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes": 413696, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects": 101, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_clones": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_copies": 505, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing_on_primary": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_degraded": 101, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_misplaced": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_unfound": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_dirty": 101, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_whiteouts": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_read": 0, 2026-03-08T22:53:13.925 INFO:tasks.workunit.client.0.vm04.stdout: "num_read_kb": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_write": 101, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_write_kb": 404, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_scrub_errors": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_shallow_scrub_errors": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_deep_scrub_errors": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_recovered": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_recovered": 0, 2026-03-08T22:53:13.926 
INFO:tasks.workunit.client.0.vm04.stdout: "num_keys_recovered": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_omap": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_hit_set_archive": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_hit_set_archive": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_kb": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_kb": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_promote": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_high": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_low": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_some": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_full": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_pinned": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_legacy_snapsets": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_large_omap_objects": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_manifest": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_bytes": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_keys": 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_repaired": 0 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "up": [ 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: 3, 
2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: 2 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "acting": [ 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: 2 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "avail_no_missing": [ 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "3(0)", 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "0(3)", 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "1(1)", 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "4(2)" 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "object_location_counts": [ 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "shards": "0(3),1(1),3(0),4(2)", 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "objects": 101 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "blocked_by": [], 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "up_primary": 3, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: 
"acting_primary": 3, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [] 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.926 INFO:tasks.workunit.client.0.vm04.stdout: "empty": 0, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "dne": 0, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "incomplete": 0, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 50, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "hit_set_history": { 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "current_last_update": "0'0", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "history": [] 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "peer": "1(1)", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "pgid": "2.0s1", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "shared": "1", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_update": "46'101", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_complete": "46'101", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "log_tail": "0'0", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_user_version": 101, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_backfill": "MAX", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [], 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "history": { 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_created": 40, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: 
"epoch_pool_created": 40, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 50, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_started": 49, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 41, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_clean": 40, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_split": 0, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_marked_full": 0, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "same_up_since": 49, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "same_interval_since": 49, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "same_primary_since": 40, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "prior_readable_until_ub": 0 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "stats": { 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "version": "46'101", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "reported_seq": 130, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "reported_epoch": 46, 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "state": 
"active+undersized+degraded", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_fresh": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_change": "2026-03-08T22:52:50.190723+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_active": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_peered": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean": "2026-03-08T22:52:49.682370+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_active": "2026-03-08T22:52:50.190723+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_peered": "2026-03-08T22:52:50.190723+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_unstale": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_undegraded": "2026-03-08T22:52:50.175452+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "last_fullsized": "2026-03-08T22:52:50.175360+0000", 2026-03-08T22:53:13.927 INFO:tasks.workunit.client.0.vm04.stdout: "mapping_epoch": 49, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "log_start": "0'0", 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_start": "0'0", 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "created": 40, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 41, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "parent": "0.0", 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "parent_split_bits": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": 
"2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "objects_scrubbed": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "log_size": 101, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "log_dups_size": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_size": 101, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "stats_invalid": false, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "dirty_stats_invalid": false, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "omap_stats_invalid": false, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_stats_invalid": false, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_bytes_stats_invalid": false, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "pin_stats_invalid": false, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "manifest_stats_invalid": false, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrimq_len": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_duration": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_schedule": "periodic scrub scheduled @ 2026-03-10T10:40:40.258384+0000", 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_duration": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "objects_trimmed": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrim_duration": 0, 
2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "stat_sum": { 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes": 413696, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects": 101, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_clones": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_copies": 505, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing_on_primary": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing": 1, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_degraded": 101, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_misplaced": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_unfound": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_dirty": 101, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_whiteouts": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_read": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_read_kb": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_write": 101, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_write_kb": 404, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_scrub_errors": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_shallow_scrub_errors": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_deep_scrub_errors": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_recovered": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_recovered": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_keys_recovered": 0, 
2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_omap": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_hit_set_archive": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_hit_set_archive": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_kb": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict": 0, 2026-03-08T22:53:13.928 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_kb": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_promote": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_high": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_low": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_some": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_full": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_pinned": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_legacy_snapsets": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_large_omap_objects": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_manifest": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_bytes": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_keys": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_repaired": 0 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "up": [ 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: 1, 
2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: 2 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "acting": [ 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: 2 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "avail_no_missing": [ 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "3(0)", 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "0(3)", 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "1(1)", 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "4(2)" 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "object_location_counts": [ 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "shards": "0(3),1(1),3(0),4(2)", 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "objects": 101 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "blocked_by": [], 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "up_primary": 3, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "acting_primary": 3, 2026-03-08T22:53:13.929 
INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [] 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "empty": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "dne": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "incomplete": 0, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 50, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "hit_set_history": { 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "current_last_update": "0'0", 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "history": [] 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "peer": "2(4)", 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "pgid": "2.0s4", 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "shared": "4", 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "last_update": "46'101", 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "last_complete": "42'1", 2026-03-08T22:53:13.929 INFO:tasks.workunit.client.0.vm04.stdout: "log_tail": "0'0", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_user_version": 1, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_backfill": "MAX", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [], 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "history": { 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_created": 40, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_pool_created": 40, 2026-03-08T22:53:13.930 
INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 50, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_started": 49, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 41, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_clean": 40, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_split": 0, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_marked_full": 0, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "same_up_since": 49, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "same_interval_since": 49, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "same_primary_since": 40, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "prior_readable_until_ub": 0 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "stats": { 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "version": "42'1", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "reported_seq": 17, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "reported_epoch": 42, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "state": "active+clean", 2026-03-08T22:53:13.930 
INFO:tasks.workunit.client.0.vm04.stdout: "last_fresh": "2026-03-08T22:52:49.682370+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_change": "2026-03-08T22:52:44.325923+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_active": "2026-03-08T22:52:49.682370+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_peered": "2026-03-08T22:52:49.682370+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean": "2026-03-08T22:52:49.682370+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_active": "2026-03-08T22:52:44.325770+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_peered": "2026-03-08T22:52:44.325770+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_unstale": "2026-03-08T22:52:49.682370+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_undegraded": "2026-03-08T22:52:49.682370+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_fullsized": "2026-03-08T22:52:49.682370+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "mapping_epoch": 49, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "log_start": "0'0", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_start": "0'0", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "created": 40, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 41, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "parent": "0.0", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "parent_split_bits": 0, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.930 
INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "objects_scrubbed": 0, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "log_size": 1, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "log_dups_size": 0, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_size": 1, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "stats_invalid": false, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "dirty_stats_invalid": false, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "omap_stats_invalid": false, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_stats_invalid": false, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "hitset_bytes_stats_invalid": false, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "pin_stats_invalid": false, 2026-03-08T22:53:13.930 INFO:tasks.workunit.client.0.vm04.stdout: "manifest_stats_invalid": false, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrimq_len": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_duration": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_schedule": "periodic scrub scheduled @ 2026-03-09T23:50:06.387830+0000", 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "scrub_duration": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "objects_trimmed": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "snaptrim_duration": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: 
"stat_sum": { 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes": 4096, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects": 1, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_clones": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_object_copies": 5, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing_on_primary": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_missing": 1, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_degraded": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_misplaced": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_unfound": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_dirty": 1, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_whiteouts": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_read": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_read_kb": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_write": 1, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_write_kb": 4, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_scrub_errors": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_shallow_scrub_errors": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_deep_scrub_errors": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_recovered": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_recovered": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_keys_recovered": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_omap": 0, 
2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_hit_set_archive": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_bytes_hit_set_archive": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_kb": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_kb": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_promote": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_high": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_flush_mode_low": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_some": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_evict_mode_full": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_pinned": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_legacy_snapsets": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_large_omap_objects": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_manifest": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_bytes": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_omap_keys": 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "num_objects_repaired": 0 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "up": [ 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:53:13.931 
INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: 2 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: "acting": [ 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: 3, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: 1, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: 4, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: 0, 2026-03-08T22:53:13.931 INFO:tasks.workunit.client.0.vm04.stdout: 2 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: ], 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "avail_no_missing": [], 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "object_location_counts": [], 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "blocked_by": [], 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "up_primary": 3, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "acting_primary": 3, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [] 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "empty": 0, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "dne": 0, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "incomplete": 0, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 50, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "hit_set_history": { 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "current_last_update": "0'0", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "history": [] 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: } 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: }, 
2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: { 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "peer": "4(2)", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "pgid": "2.0s2", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "shared": "2", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_update": "46'101", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_complete": "46'101", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "log_tail": "0'0", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_user_version": 101, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_backfill": "MAX", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "purged_snaps": [], 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "history": { 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_created": 40, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "epoch_pool_created": 40, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_started": 50, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_started": 49, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 41, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_interval_clean": 40, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_split": 0, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_marked_full": 0, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "same_up_since": 49, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "same_interval_since": 49, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "same_primary_since": 40, 2026-03-08T22:53:13.932 
INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "prior_readable_until_ub": 0 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: }, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "stats": { 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "version": "46'101", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "reported_seq": 130, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "reported_epoch": 46, 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "state": "active+undersized+degraded", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_fresh": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_change": "2026-03-08T22:52:50.190723+0000", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_active": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_peered": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean": "2026-03-08T22:52:49.682370+0000", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_active": "2026-03-08T22:52:50.190723+0000", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_became_peered": "2026-03-08T22:52:50.190723+0000", 2026-03-08T22:53:13.932 
INFO:tasks.workunit.client.0.vm04.stdout: "last_unstale": "2026-03-08T22:52:53.975083+0000", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_undegraded": "2026-03-08T22:52:50.175452+0000", 2026-03-08T22:53:13.932 INFO:tasks.workunit.client.0.vm04.stdout: "last_fullsized": "2026-03-08T22:52:50.175360+0000", 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "mapping_epoch": 49, 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "log_start": "0'0", 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_start": "0'0", 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "created": 40, 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "last_epoch_clean": 41, 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "parent": "0.0", 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "parent_split_bits": 0, 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub": "0'0", 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "last_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub": "0'0", 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "last_deep_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "last_clean_scrub_stamp": "2026-03-08T22:52:44.210362+0000", 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "objects_scrubbed": 0, 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "log_size": 101, 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "log_dups_size": 0, 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "ondisk_log_size": 101, 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: "stats_invalid": false, 2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout: 
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "dirty_stats_invalid": false,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "omap_stats_invalid": false,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "hitset_stats_invalid": false,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "hitset_bytes_stats_invalid": false,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "pin_stats_invalid": false,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "manifest_stats_invalid": false,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "snaptrimq_len": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "last_scrub_duration": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "scrub_schedule": "periodic scrub scheduled @ 2026-03-10T10:40:40.258384+0000",
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "scrub_duration": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "objects_trimmed": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "snaptrim_duration": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                "stat_sum": {
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_bytes": 413696,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects": 101,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_object_clones": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_object_copies": 505,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_missing_on_primary": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_missing": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_degraded": 101,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_misplaced": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_unfound": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_dirty": 101,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_whiteouts": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_read": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_read_kb": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_write": 101,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_write_kb": 404,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_scrub_errors": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_shallow_scrub_errors": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_deep_scrub_errors": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_recovered": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_bytes_recovered": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_keys_recovered": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_omap": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_hit_set_archive": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_bytes_hit_set_archive": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_flush": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_flush_kb": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_evict": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_evict_kb": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_promote": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_flush_mode_high": 0,
2026-03-08T22:53:13.933 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_flush_mode_low": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_evict_mode_some": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_evict_mode_full": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_pinned": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_legacy_snapsets": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_large_omap_objects": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_manifest": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_omap_bytes": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_omap_keys": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "num_objects_repaired": 0
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                },
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                "up": [
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    3,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    1,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    4,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    2
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                ],
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                "acting": [
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    3,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    1,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    4,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    2
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                ],
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                "avail_no_missing": [
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "3(0)",
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "0(3)",
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "1(1)",
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    "4(2)"
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                ],
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                "object_location_counts": [
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    {
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                        "shards": "0(3),1(1),3(0),4(2)",
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                        "objects": 101
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                    }
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                ],
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                "blocked_by": [],
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                "up_primary": 3,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                "acting_primary": 3,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                "purged_snaps": []
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:            },
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:            "empty": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:            "dne": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:            "incomplete": 0,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:            "last_epoch_started": 50,
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:            "hit_set_history": {
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                "current_last_update": "0'0",
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                "history": []
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:            }
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:        }
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:    ],
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:    "recovery_state": [
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:        {
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:            "name": "Started/Primary/Active",
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:            "enter_time": "2026-03-08T22:52:58.464730+0000",
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:            "might_have_unfound": [
2026-03-08T22:53:13.934 INFO:tasks.workunit.client.0.vm04.stdout:                {
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "osd": "0(3)",
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "status": "already probed"
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                },
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                {
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "osd": "1(1)",
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "status": "already probed"
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                },
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                {
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "osd": "2(4)",
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "status": "already probed"
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                },
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                {
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "osd": "4(2)",
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "status": "already probed"
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                }
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:            ],
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:            "recovery_progress": {
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                "backfill_targets": [],
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                "waiting_on_backfill": [],
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                "last_backfill_started": "MIN",
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                "backfill_info": {
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "begin": "MIN",
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "end": "MIN",
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "objects": []
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                },
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                "peer_backfill_info": [],
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                "backfills_in_flight": [],
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                "recovering": [],
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                "pg_backend": {
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "recovery_ops": [],
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                    "read_ops": []
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:                }
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:            }
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:        },
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:        {
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:            "name": "Started",
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:            "enter_time": "2026-03-08T22:52:58.075500+0000"
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:        }
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:    ],
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:    "scrubber": {
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:        "active": false,
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:        "must_scrub": false,
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:        "must_deep_scrub": false,
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:        "must_repair": false,
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:        "need_auto": false,
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:        "scrub_reg_stamp": "2026-03-10T10:40:40.258384+0000",
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:        "schedule": "no scrub is scheduled"
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:    },
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:    "agent_state": {}
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stdout:}
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:664: TEST_ec_recovery_unfound: ceph pg 2.0 list_unfound
2026-03-08T22:53:13.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:664: TEST_ec_recovery_unfound: grep -q obj75
2026-03-08T22:53:14.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:666: TEST_ec_recovery_unfound: ceph pg 2.0 list_unfound
2026-03-08T22:53:14.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:666: TEST_ec_recovery_unfound: jq .available_might_have_unfound
2026-03-08T22:53:14.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:666: TEST_ec_recovery_unfound: check=true
2026-03-08T22:53:14.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:667: TEST_ec_recovery_unfound: test true == true
2026-03-08T22:53:14.094
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:669: TEST_ec_recovery_unfound: ceph pg 2.0 list_unfound
2026-03-08T22:53:14.094 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:669: TEST_ec_recovery_unfound: jq '.might_have_unfound | length'
2026-03-08T22:53:14.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:669: TEST_ec_recovery_unfound: check=0
2026-03-08T22:53:14.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:670: TEST_ec_recovery_unfound: test 0 == 0
2026-03-08T22:53:14.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:673: TEST_ec_recovery_unfound: timeout 5 rados -p pool-jerasure get obj75 td/test-erasure-eio/CHECK
2026-03-08T22:53:19.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:674: TEST_ec_recovery_unfound: test 124 = 124
2026-03-08T22:53:19.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:676: TEST_ec_recovery_unfound: ceph pg 2.0 mark_unfound_lost delete
2026-03-08T22:53:19.260 INFO:tasks.workunit.client.0.vm04.stderr:pg has no unfound objects
2026-03-08T22:53:19.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:678: TEST_ec_recovery_unfound: wait_for_clean
2026-03-08T22:53:19.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:53:19.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:53:19.269 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:53:19.270 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:53:19.270 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:53:19.270 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:53:19.270 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:53:19.270 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:53:19.270 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:53:19.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:53:19.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:53:19.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:53:19.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:53:19.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:53:19.345 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:53:19.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:53:19.575 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:53:19.575 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:53:19.575 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:53:19.575 INFO:tasks.workunit.client.0.vm04.stderr:4'
2026-03-08T22:53:19.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:53:19.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:53:19.575 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:53:19.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803793
2026-03-08T22:53:19.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803793
2026-03-08T22:53:19.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793'
2026-03-08T22:53:19.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:53:19.677 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:53:19.760 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574864
2026-03-08T22:53:19.760 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574864
2026-03-08T22:53:19.760 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793 1-55834574864'
2026-03-08T22:53:19.760 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:53:19.760 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:53:19.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=206158430215
2026-03-08T22:53:19.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 206158430215
2026-03-08T22:53:19.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793 1-55834574864 2-206158430215'
2026-03-08T22:53:19.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:53:19.836 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:53:19.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117005
2026-03-08T22:53:19.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117005
2026-03-08T22:53:19.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793 1-55834574864 2-206158430215 3-115964117005'
2026-03-08T22:53:19.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:53:19.911 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T22:53:19.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=154618822668
2026-03-08T22:53:19.990 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 154618822668
2026-03-08T22:53:19.990 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803793 1-55834574864 2-206158430215 3-115964117005 4-154618822668'
2026-03-08T22:53:19.990 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:19.990 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803793
2026-03-08T22:53:19.990 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:19.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:53:19.992 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803793
2026-03-08T22:53:19.992 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:19.993 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803793
2026-03-08T22:53:19.993 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803793'
2026-03-08T22:53:19.993 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803793
2026-03-08T22:53:19.993 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:53:20.212 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803791 -lt 25769803793
2026-03-08T22:53:20.212 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:53:21.213 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:53:21.213 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:53:21.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803793 -lt 25769803793
2026-03-08T22:53:21.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:21.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574864
2026-03-08T22:53:21.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:21.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:53:21.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574864
2026-03-08T22:53:21.448 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574864
2026-03-08T22:53:21.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574864'
2026-03-08T22:53:21.448 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574864
2026-03-08T22:53:21.449 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:53:21.674 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574864 -lt 55834574864
2026-03-08T22:53:21.674 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:21.674 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-206158430215
2026-03-08T22:53:21.674 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:21.675 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:53:21.676 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-206158430215
2026-03-08T22:53:21.676 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:21.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=206158430215
2026-03-08T22:53:21.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 206158430215'
2026-03-08T22:53:21.677 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 206158430215
2026-03-08T22:53:21.677 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:53:21.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 206158430215 -lt 206158430215
2026-03-08T22:53:21.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:21.916 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117005
2026-03-08T22:53:21.916 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:21.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:53:21.918 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117005
2026-03-08T22:53:21.918 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:21.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117005
2026-03-08T22:53:21.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117005'
2026-03-08T22:53:21.919 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964117005
2026-03-08T22:53:21.920 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:53:22.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117005 -lt 115964117005
2026-03-08T22:53:22.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:22.140 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-154618822668
2026-03-08T22:53:22.141 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:22.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T22:53:22.142 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-154618822668
2026-03-08T22:53:22.142 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:22.144 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=154618822668
2026-03-08T22:53:22.144 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 154618822668'
2026-03-08T22:53:22.144 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 154618822668
2026-03-08T22:53:22.144 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T22:53:22.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 154618822668 -lt 154618822668
2026-03-08T22:53:22.378 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:53:22.378 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:53:22.378 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:53:22.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:53:22.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:53:22.689 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:53:22.689 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:53:22.690 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:53:22.690 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:53:22.690 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:53:22.690 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:53:22.926 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:53:22.926 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:53:22.926 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:53:22.926 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:53:23.223 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:53:23.223 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:53:23.223 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:53:23.223 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: seq 1 100
2026-03-08T22:53:23.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:53:23.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj1 = obj75 ']'
2026-03-08T22:53:23.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj1 td/test-erasure-eio/CHECK
2026-03-08T22:53:23.250 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:53:23.250 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:53:23.250 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj2 = obj75 ']'
2026-03-08T22:53:23.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj2 td/test-erasure-eio/CHECK
2026-03-08T22:53:23.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:53:23.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:53:23.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj3 = obj75 ']'
2026-03-08T22:53:23.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj3 td/test-erasure-eio/CHECK
2026-03-08T22:53:23.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:53:23.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:53:23.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj4 = obj75 ']'
2026-03-08T22:53:23.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj4 td/test-erasure-eio/CHECK
2026-03-08T22:53:23.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:53:23.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj)
2026-03-08T22:53:23.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj5 = obj75 ']'
2026-03-08T22:53:23.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj5 td/test-erasure-eio/CHECK
2026-03-08T22:53:23.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK
2026-03-08T22:53:23.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680:
TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj6 = obj75 ']' 2026-03-08T22:53:23.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj6 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj7 = obj75 ']' 2026-03-08T22:53:23.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj7 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: 
TEST_ec_recovery_unfound: '[' obj8 = obj75 ']' 2026-03-08T22:53:23.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj8 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj9 = obj75 ']' 2026-03-08T22:53:23.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj9 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj10 = obj75 ']' 2026-03-08T22:53:23.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: 
TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj10 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.477 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj11 = obj75 ']' 2026-03-08T22:53:23.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj11 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj12 = obj75 ']' 2026-03-08T22:53:23.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj12 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.532 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj13 = obj75 ']' 2026-03-08T22:53:23.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj13 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.568 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj14 = obj75 ']' 2026-03-08T22:53:23.569 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj14 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:53:23.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj15 = obj75 ']' 2026-03-08T22:53:23.597 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj15 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.623 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj16 = obj75 ']' 2026-03-08T22:53:23.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj16 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.649 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.650 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for 
i in $(seq 1 $lastobj) 2026-03-08T22:53:23.650 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj17 = obj75 ']' 2026-03-08T22:53:23.650 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj17 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.675 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj18 = obj75 ']' 2026-03-08T22:53:23.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj18 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.702 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.702 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' 
obj19 = obj75 ']' 2026-03-08T22:53:23.702 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj19 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.728 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj20 = obj75 ']' 2026-03-08T22:53:23.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj20 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.755 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.756 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.756 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj21 = obj75 ']' 2026-03-08T22:53:23.756 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure get obj21 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj22 = obj75 ']' 2026-03-08T22:53:23.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj22 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.806 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj23 = obj75 ']' 2026-03-08T22:53:23.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj23 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.831 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj24 = obj75 ']' 2026-03-08T22:53:23.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj24 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj25 = obj75 ']' 2026-03-08T22:53:23.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj25 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:53:23.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj26 = obj75 ']' 2026-03-08T22:53:23.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj26 td/test-erasure-eio/CHECK 2026-03-08T22:53:23.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:23.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:23.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj27 = obj75 ']' 2026-03-08T22:53:23.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj27 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.001 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.003 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for 
i in $(seq 1 $lastobj) 2026-03-08T22:53:24.003 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj28 = obj75 ']' 2026-03-08T22:53:24.003 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj28 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj29 = obj75 ']' 2026-03-08T22:53:24.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj29 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' 
obj30 = obj75 ']' 2026-03-08T22:53:24.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj30 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj31 = obj75 ']' 2026-03-08T22:53:24.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj31 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.108 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.109 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.109 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj32 = obj75 ']' 2026-03-08T22:53:24.109 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure get obj32 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj33 = obj75 ']' 2026-03-08T22:53:24.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj33 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj34 = obj75 ']' 2026-03-08T22:53:24.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj34 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.189 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj35 = obj75 ']' 2026-03-08T22:53:24.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj35 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj36 = obj75 ']' 2026-03-08T22:53:24.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj36 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:53:24.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj37 = obj75 ']' 2026-03-08T22:53:24.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj37 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.273 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.274 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.274 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj38 = obj75 ']' 2026-03-08T22:53:24.274 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj38 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for 
i in $(seq 1 $lastobj) 2026-03-08T22:53:24.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj39 = obj75 ']' 2026-03-08T22:53:24.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj39 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj40 = obj75 ']' 2026-03-08T22:53:24.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj40 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' 
obj41 = obj75 ']' 2026-03-08T22:53:24.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj41 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj42 = obj75 ']' 2026-03-08T22:53:24.380 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj42 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj43 = obj75 ']' 2026-03-08T22:53:24.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure get obj43 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj44 = obj75 ']' 2026-03-08T22:53:24.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj44 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj45 = obj75 ']' 2026-03-08T22:53:24.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj45 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.477 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj46 = obj75 ']' 2026-03-08T22:53:24.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj46 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.502 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj47 = obj75 ']' 2026-03-08T22:53:24.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj47 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:53:24.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj48 = obj75 ']' 2026-03-08T22:53:24.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj48 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj49 = obj75 ']' 2026-03-08T22:53:24.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj49 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for 
i in $(seq 1 $lastobj) 2026-03-08T22:53:24.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj50 = obj75 ']' 2026-03-08T22:53:24.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj50 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj51 = obj75 ']' 2026-03-08T22:53:24.609 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj51 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.636 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.638 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.638 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' 
obj52 = obj75 ']' 2026-03-08T22:53:24.638 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj52 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.664 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.664 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj53 = obj75 ']' 2026-03-08T22:53:24.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj53 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj54 = obj75 ']' 2026-03-08T22:53:24.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure get obj54 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj55 = obj75 ']' 2026-03-08T22:53:24.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj55 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.757 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.757 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.757 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj56 = obj75 ']' 2026-03-08T22:53:24.757 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj56 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.779 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj57 = obj75 ']' 2026-03-08T22:53:24.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj57 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.800 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj58 = obj75 ']' 2026-03-08T22:53:24.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj58 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.825 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:53:24.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj59 = obj75 ']' 2026-03-08T22:53:24.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj59 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj60 = obj75 ']' 2026-03-08T22:53:24.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj60 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.880 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for 
i in $(seq 1 $lastobj) 2026-03-08T22:53:24.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj61 = obj75 ']' 2026-03-08T22:53:24.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj61 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj62 = obj75 ']' 2026-03-08T22:53:24.908 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj62 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' 
obj63 = obj75 ']' 2026-03-08T22:53:24.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj63 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj64 = obj75 ']' 2026-03-08T22:53:24.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj64 td/test-erasure-eio/CHECK 2026-03-08T22:53:24.990 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:24.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:24.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj65 = obj75 ']' 2026-03-08T22:53:24.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure get obj65 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.020 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj66 = obj75 ']' 2026-03-08T22:53:25.021 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj66 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj67 = obj75 ']' 2026-03-08T22:53:25.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj67 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.075 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.077 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.077 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj68 = obj75 ']' 2026-03-08T22:53:25.077 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj68 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj69 = obj75 ']' 2026-03-08T22:53:25.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj69 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:53:25.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj70 = obj75 ']' 2026-03-08T22:53:25.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj70 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj71 = obj75 ']' 2026-03-08T22:53:25.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj71 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.181 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for 
i in $(seq 1 $lastobj) 2026-03-08T22:53:25.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj72 = obj75 ']' 2026-03-08T22:53:25.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj72 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj73 = obj75 ']' 2026-03-08T22:53:25.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj73 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' 
obj74 = obj75 ']' 2026-03-08T22:53:25.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj74 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.261 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj75 = obj75 ']' 2026-03-08T22:53:25.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:684: TEST_ec_recovery_unfound: rados -p pool-jerasure get obj75 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.284 INFO:tasks.workunit.client.0.vm04.stderr:error getting pool-jerasure/obj75: (2) No such file or directory 2026-03-08T22:53:25.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj76 = obj75 ']' 2026-03-08T22:53:25.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj76 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.312 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj77 = obj75 ']' 2026-03-08T22:53:25.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj77 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj78 = obj75 ']' 2026-03-08T22:53:25.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj78 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:53:25.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj79 = obj75 ']' 2026-03-08T22:53:25.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj79 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj80 = obj75 ']' 2026-03-08T22:53:25.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj80 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for 
i in $(seq 1 $lastobj) 2026-03-08T22:53:25.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj81 = obj75 ']' 2026-03-08T22:53:25.413 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj81 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj82 = obj75 ']' 2026-03-08T22:53:25.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj82 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' 
obj83 = obj75 ']' 2026-03-08T22:53:25.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj83 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.484 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj84 = obj75 ']' 2026-03-08T22:53:25.485 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj84 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.510 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.510 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.510 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj85 = obj75 ']' 2026-03-08T22:53:25.510 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure get obj85 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.538 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj86 = obj75 ']' 2026-03-08T22:53:25.538 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj86 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.564 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj87 = obj75 ']' 2026-03-08T22:53:25.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj87 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.591 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj88 = obj75 ']' 2026-03-08T22:53:25.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj88 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.617 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj89 = obj75 ']' 2026-03-08T22:53:25.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj89 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:53:25.643 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.643 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj90 = obj75 ']' 2026-03-08T22:53:25.643 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj90 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.664 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj91 = obj75 ']' 2026-03-08T22:53:25.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj91 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for 
i in $(seq 1 $lastobj) 2026-03-08T22:53:25.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj92 = obj75 ']' 2026-03-08T22:53:25.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj92 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj93 = obj75 ']' 2026-03-08T22:53:25.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj93 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' 
obj94 = obj75 ']' 2026-03-08T22:53:25.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj94 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj95 = obj75 ']' 2026-03-08T22:53:25.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj95 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj96 = obj75 ']' 2026-03-08T22:53:25.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados 
--pool pool-jerasure get obj96 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.897 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj97 = obj75 ']' 2026-03-08T22:53:25.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj97 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj98 = obj75 ']' 2026-03-08T22:53:25.950 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj98 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.959 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj99 = obj75 ']' 2026-03-08T22:53:25.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj99 td/test-erasure-eio/CHECK 2026-03-08T22:53:25.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:25.982 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:680: TEST_ec_recovery_unfound: for i in $(seq 1 $lastobj) 2026-03-08T22:53:25.982 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:682: TEST_ec_recovery_unfound: '[' obj100 = obj75 ']' 2026-03-08T22:53:25.982 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:686: TEST_ec_recovery_unfound: rados --pool pool-jerasure get obj100 td/test-erasure-eio/CHECK 2026-03-08T22:53:26.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:687: TEST_ec_recovery_unfound: diff -q td/test-erasure-eio/ORIGINAL 
td/test-erasure-eio/CHECK 2026-03-08T22:53:26.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:691: TEST_ec_recovery_unfound: rm -f td/test-erasure-eio/ORIGINAL td/test-erasure-eio/CHECK 2026-03-08T22:53:26.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:693: TEST_ec_recovery_unfound: delete_erasure_coded_pool pool-jerasure 2026-03-08T22:53:26.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:53:26.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it 2026-03-08T22:53:26.284 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T22:53:26.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile 2026-03-08T22:53:26.592 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T22:53:26.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T22:53:26.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:53:26.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:53:26.603 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:53:26.604 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:53:26.604 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:53:26.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:53:26.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:53:26.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:53:26.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:53:26.742 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:53:26.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:53:26.743 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:53:26.744 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:53:26.744 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:53:26.745 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:53:26.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:53:26.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:53:26.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:53:26.746 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:53:26.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:53:26.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:53:26.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:53:26.777 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:53:26.777 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:26.777 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:26.777 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:53:26.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:53:26.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:53:26.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T22:53:26.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T22:53:26.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T22:53:26.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio 2026-03-08T22:53:26.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:53:26.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:53:26.778 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:53:26.779 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:53:26.779 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:53:26.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:53:26.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:53:26.779 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:53:26.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:53:26.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:53:26.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:53:26.783 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:53:26.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:53:26.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:53:26.784 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:53:26.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:53:26.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:53:26.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:53:26.786 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:53:26.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:53:26.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:53:26.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:53:26.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:53:26.790 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:26.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:26.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:53:26.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:53:26.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:53:26.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T22:53:26.792 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:53:26.792 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:26.792 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:26.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080 2026-03-08T22:53:26.794 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:53:26.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
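The teardown/setup traces call `get_asok_dir` and `get_asok_path` repeatedly. A minimal sketch of what those helpers do, reconstructed from the trace; the name of the override variable (`CEPH_ASOK_DIR` here) is an assumption, since the trace only ever exercises the fallback branch:

```shell
# Sketch of the admin-socket path helpers as seen in the xtrace.
# Assumption: CEPH_ASOK_DIR is the override tested by the '[ -n ... ]' check.
get_asok_dir() {
    if [ -n "$CEPH_ASOK_DIR" ]; then
        echo "$CEPH_ASOK_DIR"
    else
        echo "/tmp/ceph-asok.$$"   # per-run dir, e.g. /tmp/ceph-asok.102080
    fi
}

get_asok_path() {
    local name=$1
    if [ -n "$name" ]; then
        # Concrete socket path for a named daemon, e.g. .../ceph-mon.a.asok
        echo "$(get_asok_dir)/ceph-$name.asok"
    else
        # Unexpanded template handed to daemons via --admin-socket
        echo "$(get_asok_dir)/\$cluster-\$name.asok"
    fi
}
```

The template form (with literal `$cluster`/`$name`) is what shows up quoted in the daemon command lines below; each daemon expands it itself.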
2026-03-08T22:53:26.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:53:26.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T22:53:26.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T22:53:26.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T22:53:26.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:53:26.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:53:26.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:53:26.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T22:53:26.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T22:53:26.823 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:53:26.823 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:26.823 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:26.823 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:26.823 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:26.823 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:26.823 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:26.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:53:26.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:53:26.971 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 
2026-03-08T22:53:26.971 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:53:26.971 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:53:26.971 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:53:26.972 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:53:26.972 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:53:26.972 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:53:26.972 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:53:26.972 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:26.972 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:26.973 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:53:26.973 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:53:26.973 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:53:26.973 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T22:53:27.028 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:53:27.028 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:53:27.028 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:53:27.028 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:53:27.028 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:53:27.028 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:53:27.028 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:53:27.028 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:53:27.029 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:53:27.029 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:27.029 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:27.029 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:53:27.029 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:53:27.029 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T22:53:27.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T22:53:27.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T22:53:27.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:53:27.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:53:27.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:53:27.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T22:53:27.087 
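The two `get_config` calls traced above (for `fsid` and `mon_host`) both follow the same shape: query one option over the daemon's admin socket and pull the value out of the JSON reply with jq. A condensed sketch, assuming a `get_asok_path` helper like the one exercised in the trace:

```shell
# Sketch of get_config as reconstructed from the xtrace above.
get_config() {
    local daemon=$1 id=$2 config=$3
    # CEPH_ARGS is cleared for the query so ambient defaults don't leak in,
    # exactly as the trace shows (get_config: CEPH_ARGS=).
    CEPH_ARGS='' ceph --format json daemon \
        "$(get_asok_path "$daemon.$id")" config get "$config" |
        jq -r ".$config"
}

# usage, per the trace:  fsid=$(get_config mon a fsid)
```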
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:53:27.212 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:53:27.212 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:27.212 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:27.212 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:27.212 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:27.212 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:27.213 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:27.213 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:53:27.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 
--run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:53:27.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T22:53:27.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:53:27.361 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:53:27.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:53:28.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T22:53:28.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:53:28.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:53:28.374 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:53:28.374 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:28.374 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:28.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:53:28.379 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T22:53:28.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T22:53:28.421 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:53:28.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T22:53:28.427 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:53:26.848+0000 7ff2b8c12d80 0 load: jerasure load: lrc 2026-03-08T22:53:28.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_ec_single_recovery_error td/test-erasure-eio 2026-03-08T22:53:28.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:390: TEST_ec_single_recovery_error: local dir=td/test-erasure-eio 2026-03-08T22:53:28.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:391: TEST_ec_single_recovery_error: local objname=myobject 2026-03-08T22:53:28.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:393: TEST_ec_single_recovery_error: setup_osds 7 2026-03-08T22:53:28.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=7 2026-03-08T22:53:28.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T22:53:28.427 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 7 - 1 2026-03-08T22:53:28.428 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 6 2026-03-08T22:53:28.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:53:28.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T22:53:28.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: 
ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 
2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:28.430 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:53:28.431 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:53:28.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T22:53:28.432 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:53:28.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=072ec6fe-853d-411c-a873-59e9907ba01c 2026-03-08T22:53:28.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 072ec6fe-853d-411c-a873-59e9907ba01c' 2026-03-08T22:53:28.433 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 072ec6fe-853d-411c-a873-59e9907ba01c 2026-03-08T22:53:28.433 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:53:28.445 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDo/a1p60KnGhAA6/+sLIRrLTVn8fxuKwv2PA== 2026-03-08T22:53:28.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDo/a1p60KnGhAA6/+sLIRrLTVn8fxuKwv2PA=="}' 2026-03-08T22:53:28.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 072ec6fe-853d-411c-a873-59e9907ba01c -i td/test-erasure-eio/0/new.json 2026-03-08T22:53:28.578 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:53:28.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T22:53:28.589 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDo/a1p60KnGhAA6/+sLIRrLTVn8fxuKwv2PA== --osd-uuid 072ec6fe-853d-411c-a873-59e9907ba01c 2026-03-08T22:53:28.614 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:28.614+0000 7fa52369d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:28.619 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:28.620+0000 7fa52369d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:28.620 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:28.620+0000 7fa52369d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:28.620 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:28.621+0000 7fa52369d780 -1 bdev(0x562217a70800 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:53:28.620 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:28.621+0000 7fa52369d780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T22:53:31.252 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T22:53:31.252 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:53:31.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:53:31.253 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:53:31.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:53:31.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:53:31.560 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:53:31.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:53:31.561 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:53:31.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:53:31.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:53:31.579 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:31.579+0000 7f919540d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:31.584 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:31.585+0000 7f919540d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:31.586 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:31.586+0000 7f919540d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:31.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:53:31.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:53:31.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:53:31.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:53:31.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:53:31.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:31.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:53:31.789 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:53:31.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:31.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:53:32.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:32.909 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:32.910+0000 7f919540d780 -1 Falling back to public interface 2026-03-08T22:53:33.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:53:33.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:33.052 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:53:33.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:53:33.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:33.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:53:33.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:33.766 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:33.767+0000 7f919540d780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:53:34.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:34.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:34.283 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:53:34.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:53:34.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:34.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:53:34.530 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:34.973 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:34.974+0000 7f919044e640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:53:35.532 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:53:35.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:35.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:35.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:53:35.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:35.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:53:35.780 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3869464985,v1:127.0.0.1:6803/3869464985] [v2:127.0.0.1:6804/3869464985,v1:127.0.0.1:6805/3869464985] exists,up 072ec6fe-853d-411c-a873-59e9907ba01c 2026-03-08T22:53:35.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:53:35.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:53:35.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:53:35.781 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:53:35.781 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:53:35.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:53:35.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:53:35.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:35.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:35.782 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:35.782 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:35.782 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:35.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:53:35.783 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:53:35.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:53:35.785 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:53:35.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=944a07b8-65fa-48b4-9b94-d42304cf8339 2026-03-08T22:53:35.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 944a07b8-65fa-48b4-9b94-d42304cf8339' 2026-03-08T22:53:35.786 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 944a07b8-65fa-48b4-9b94-d42304cf8339 2026-03-08T22:53:35.786 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:53:35.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDv/a1p1GrRLxAAZJFjtmnNvsC/yBe3gcdpNA== 2026-03-08T22:53:35.801 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDv/a1p1GrRLxAAZJFjtmnNvsC/yBe3gcdpNA=="}' 2026-03-08T22:53:35.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 944a07b8-65fa-48b4-9b94-d42304cf8339 -i td/test-erasure-eio/1/new.json 2026-03-08T22:53:36.084 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:53:36.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T22:53:36.097 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDv/a1p1GrRLxAAZJFjtmnNvsC/yBe3gcdpNA== --osd-uuid 944a07b8-65fa-48b4-9b94-d42304cf8339 2026-03-08T22:53:36.119 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:36.120+0000 7ff9a0320780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:36.121 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:36.122+0000 7ff9a0320780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:36.123 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:36.123+0000 7ff9a0320780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:36.123 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:36.124+0000 7ff9a0320780 -1 bdev(0x5570e8273c00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:53:36.123 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:36.124+0000 7ff9a0320780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T22:53:38.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T22:53:38.741 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:53:38.742 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T22:53:38.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:53:38.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:53:39.063 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T22:53:39.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:53:39.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:53:39.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:53:39.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:53:39.067 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:53:39.084 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:39.085+0000 7f4d6bdf9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:39.091 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:39.092+0000 7f4d6bdf9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:39.093 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:39.093+0000 7f4d6bdf9780 -1 WARNING: all dangerous and experimental features are enabled. 
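The three-command pipeline at ceph-helpers.sh:681 is a guard: run_osd only waits for the daemon to come up when the cluster does not have the `noup` flag set (an osd cannot be marked up while `noup` is in effect). The helper pipes `ceph osd dump --format=json` through `jq '.flags_set[]'` before grepping; the sketch below greps canned JSON directly to stay self-contained:

```shell
#!/bin/sh
# Sketch of the noup guard at ceph-helpers.sh:681.
# `dump` stands in for the output of `ceph osd dump --format=json`.
dump='{"flags_set": ["sortbitwise", "noup"]}'
if printf '%s' "$dump" | grep -q '"noup"'; then
    noup=1      # flag set: skip wait_for_osd, the osd cannot come up
else
    noup=0      # flag clear: proceed to `wait_for_osd up $id`
fi
```

In the run shown here the flag is not set, so each run_osd call falls through to wait_for_osd.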
2026-03-08T22:53:39.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:53:39.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:53:39.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:53:39.308 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:53:39.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:53:39.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:53:39.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:39.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:53:39.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:39.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:53:39.542 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:39.897 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:39.898+0000 7f4d6bdf9780 -1 Falling back to public interface 2026-03-08T22:53:40.544 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:53:40.544 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:40.544 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:53:40.544 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:53:40.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:40.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:53:40.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:41.017 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:41.017+0000 7f4d6bdf9780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:53:41.792 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:53:41.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:41.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:41.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:53:41.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:41.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:53:42.032 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:42.100 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:42.101+0000 7f4d6759a640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T22:53:43.034 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:53:43.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:43.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:43.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:53:43.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:43.035 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:53:43.277 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/2122880140,v1:127.0.0.1:6811/2122880140] [v2:127.0.0.1:6812/2122880140,v1:127.0.0.1:6813/2122880140] exists,up 944a07b8-65fa-48b4-9b94-d42304cf8339 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:53:43.278 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:53:43.278 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:43.278 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:43.278 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:53:43.279 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:53:43.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:53:43.280 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:53:43.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=def12307-b3ba-4b43-8ce2-d3161081c896 2026-03-08T22:53:43.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 def12307-b3ba-4b43-8ce2-d3161081c896' 2026-03-08T22:53:43.281 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 def12307-b3ba-4b43-8ce2-d3161081c896 2026-03-08T22:53:43.281 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:53:43.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQD3/a1pD0ykERAANdZUYbALMoKxcNCGs9paDA== 2026-03-08T22:53:43.294 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQD3/a1pD0ykERAANdZUYbALMoKxcNCGs9paDA=="}' 2026-03-08T22:53:43.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new def12307-b3ba-4b43-8ce2-d3161081c896 -i td/test-erasure-eio/2/new.json 2026-03-08T22:53:43.542 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:53:43.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T22:53:43.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQD3/a1pD0ykERAANdZUYbALMoKxcNCGs9paDA== --osd-uuid def12307-b3ba-4b43-8ce2-d3161081c896 2026-03-08T22:53:43.574 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:43.575+0000 7f3e55a1c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:43.577 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:43.578+0000 7f3e55a1c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:43.578 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:43.579+0000 7f3e55a1c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:43.579 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:43.580+0000 7f3e55a1c780 -1 bdev(0x55bb7366bc00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:53:43.579 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:43.580+0000 7f3e55a1c780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T22:53:45.955 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T22:53:45.955 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:53:45.956 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:53:45.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:53:45.956 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:53:46.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:53:46.282 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:53:46.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:53:46.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:53:46.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:53:46.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:53:46.303 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:46.303+0000 7fbfa3038780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:46.307 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:46.308+0000 7fbfa3038780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:46.310 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:46.309+0000 7fbfa3038780 -1 WARNING: all dangerous and experimental features are enabled. 
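The xtrace above shows `run_osd` building one long `ceph_args` string (ceph-helpers.sh:639-668) before invoking `ceph-osd` twice: once with `--mkfs` and once to start the daemon. A condensed sketch of that argument assembly, as a standalone function (`build_osd_args` is a hypothetical name, not part of ceph-helpers.sh; it only prints the command line and does not start a daemon):

```shell
#!/usr/bin/env bash
# Hypothetical condensed sketch of the ceph_args assembly that run_osd
# performs in qa/standalone/ceph-helpers.sh (per the trace above).
# It builds and echoes the command line only; no daemon is launched.
build_osd_args() {
    local dir=$1 id=$2 fsid=$3 mon=$4
    local args="--fsid=$fsid --auth-supported=none --mon-host=$mon"
    args+=" --osd-failsafe-full-ratio=.99"
    args+=" --osd-journal-size=100"
    args+=" --osd-scrub-load-threshold=2000"
    args+=" --osd-data=$dir/$id"
    args+=" --osd-journal=$dir/$id/journal"
    args+=" --run-dir=$dir"
    args+=" --debug-osd=20 --debug-ms=1 --debug-monc=20"
    # $name/$cluster are expanded by ceph itself, so they stay single-quoted
    args+=" --log-file=$dir/\$name.log --pid-file=$dir/\$name.pid"
    echo "ceph-osd -i $id $args"
}

build_osd_args td/test-erasure-eio 2 \
    fc7ae229-234e-46c3-818a-2e4e2ac45700 127.0.0.1:7112
```

The real helper also appends the experimental-features flag and the mclock profile seen in the trace; they are omitted here for brevity.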
2026-03-08T22:53:46.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:53:46.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:53:46.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:53:46.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:53:46.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:53:46.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:46.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:53:46.582 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:53:46.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:46.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:53:46.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:47.376 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:47.377+0000 7fbfa3038780 -1 Falling back to public interface 2026-03-08T22:53:47.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:53:47.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:47.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:53:47.817 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:53:47.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:47.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:53:48.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:48.504 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:48.505+0000 7fbfa3038780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:53:49.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:49.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:49.054 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:53:49.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:53:49.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:53:49.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:49.318 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:49.523 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:49.523+0000 7fbf9e7d9640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T22:53:50.321 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:53:50.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:50.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:50.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:53:50.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:50.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:53:50.561 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 19 up_thru 19 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/269309546,v1:127.0.0.1:6819/269309546] [v2:127.0.0.1:6820/269309546,v1:127.0.0.1:6821/269309546] exists,up def12307-b3ba-4b43-8ce2-d3161081c896 2026-03-08T22:53:50.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:53:50.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:53:50.563 
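The `wait_for_osd` calls traced above (ceph-helpers.sh:978-991) poll `ceph osd dump | grep 'osd.N up'` once per second, up to 300 attempts, echoing the attempt counter each round. A generic sketch of that retry loop (`wait_for` is a hypothetical stand-in name; the check command is injected rather than hard-coded to `ceph osd dump`):

```shell
#!/usr/bin/env bash
# Generic sketch of the wait_for_osd polling loop seen in the trace:
# retry an arbitrary check command once per second, up to max_tries times.
wait_for() {
    local max_tries=$1; shift
    local status=1
    for ((i = 0; i < max_tries; i++)); do
        echo "$i"          # the helper echoes the attempt counter each round
        if "$@"; then      # e.g. a check like: ceph osd dump | grep 'osd.2 up'
            status=0
            break
        fi
        sleep 1
    done
    return "$status"
}

# Usage (hypothetical check standing in for `ceph osd dump | grep`):
# wait_for 300 check_osd_up 2
```

In the run above the check for osd.2 succeeded on the fourth attempt (counter 3), at which point `status=0`, `break`, and `return 0` fire exactly as in the trace.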
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:53:50.563 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:50.563 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:50.564 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:50.564 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:50.564 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:50.564 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:53:50.565 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:53:50.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:53:50.566 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:53:50.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=08ec3d89-0a83-45bb-b854-3c284cc805f9 2026-03-08T22:53:50.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 08ec3d89-0a83-45bb-b854-3c284cc805f9' 2026-03-08T22:53:50.567 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 08ec3d89-0a83-45bb-b854-3c284cc805f9 2026-03-08T22:53:50.568 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:53:50.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQD+/a1pLIbJIhAAWDzODxBrdRjcDxC5hcWLbg== 2026-03-08T22:53:50.582 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQD+/a1pLIbJIhAAWDzODxBrdRjcDxC5hcWLbg=="}' 2026-03-08T22:53:50.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 08ec3d89-0a83-45bb-b854-3c284cc805f9 -i td/test-erasure-eio/3/new.json 2026-03-08T22:53:50.829 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:53:50.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json 2026-03-08T22:53:50.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQD+/a1pLIbJIhAAWDzODxBrdRjcDxC5hcWLbg== --osd-uuid 08ec3d89-0a83-45bb-b854-3c284cc805f9 2026-03-08T22:53:50.861 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:50.862+0000 7f026c720780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:50.864 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:50.865+0000 7f026c720780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:50.864 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:50.865+0000 7f026c720780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:50.865 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:50.866+0000 7f026c720780 -1 bdev(0x55aa7b83dc00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:53:50.865 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:50.866+0000 7f026c720780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid 2026-03-08T22:53:53.015 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring 2026-03-08T22:53:53.015 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:53:53.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:53:53.016 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository 2026-03-08T22:53:53.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:53:53.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:53:53.331 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3 2026-03-08T22:53:53.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:53:53.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:53:53.333 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:53:53.335 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:53:53.351 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:53.351+0000 7f7b921d0780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:53.359 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:53.360+0000 7f7b921d0780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:53.360 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:53.361+0000 7f7b921d0780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:53.680 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:53:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:53:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:53:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:53:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:53:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:53:53.681 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:53:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:53.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:53:53.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:54.202 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:54.202+0000 7f7b921d0780 -1 Falling back to public interface 2026-03-08T22:53:54.913 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:53:54.913 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:54.913 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:53:54.913 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:53:54.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:54.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:53:55.162 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:55.290 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:55.290+0000 7f7b921d0780 -1 osd.3 0 log_to_monitors true 2026-03-08T22:53:56.164 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:53:56.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:56.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:56.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:53:56.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:56.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:53:56.371 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:56.372+0000 
7f7b8d96f640 -1 osd.3 0 waiting for initial osdmap 2026-03-08T22:53:56.454 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3891897463,v1:127.0.0.1:6827/3891897463] [v2:127.0.0.1:6828/3891897463,v1:127.0.0.1:6829/3891897463] exists,up 08ec3d89-0a83-45bb-b854-3c284cc805f9 2026-03-08T22:53:56.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:53:56.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:53:56.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:53:56.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:53:56.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 4 2026-03-08T22:53:56.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:53:56.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:53:56.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4 2026-03-08T22:53:56.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:53:56.454 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/4 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/4' 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/4/journal' 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:53:56.455 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:53:56.455 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:53:56.456 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:53:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/4 2026-03-08T22:53:56.457 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:53:56.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local 
uuid=d8bf9d30-bdd7-4963-ab57-87c919386c97 2026-03-08T22:53:56.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 d8bf9d30-bdd7-4963-ab57-87c919386c97' 2026-03-08T22:53:56.458 INFO:tasks.workunit.client.0.vm04.stdout:add osd4 d8bf9d30-bdd7-4963-ab57-87c919386c97 2026-03-08T22:53:56.458 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:53:56.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAE/q1pga1gHBAAWjCjWTBdwZy0N+z0MGGmJw== 2026-03-08T22:53:56.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAE/q1pga1gHBAAWjCjWTBdwZy0N+z0MGGmJw=="}' 2026-03-08T22:53:56.475 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new d8bf9d30-bdd7-4963-ab57-87c919386c97 -i td/test-erasure-eio/4/new.json 2026-03-08T22:53:56.737 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T22:53:56.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/4/new.json 2026-03-08T22:53:56.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 
--debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAE/q1pga1gHBAAWjCjWTBdwZy0N+z0MGGmJw== --osd-uuid d8bf9d30-bdd7-4963-ab57-87c919386c97 2026-03-08T22:53:56.767 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:56.768+0000 7f0a64815780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:56.769 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:56.770+0000 7f0a64815780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:56.770 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:56.771+0000 7f0a64815780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:56.771 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:56.772+0000 7f0a64815780 -1 bdev(0x55ea4335bc00 td/test-erasure-eio/4/block) open stat got: (1) Operation not permitted 2026-03-08T22:53:56.771 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:56.772+0000 7f0a64815780 -1 bluestore(td/test-erasure-eio/4) _read_fsid unparsable uuid 2026-03-08T22:53:59.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/4/keyring 2026-03-08T22:53:59.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:53:59.527 INFO:tasks.workunit.client.0.vm04.stdout:adding osd4 key to auth repository 2026-03-08T22:53:59.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository 2026-03-08T22:53:59.527 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:53:59.833 INFO:tasks.workunit.client.0.vm04.stdout:start osd.4 2026-03-08T22:53:59.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4 2026-03-08T22:53:59.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/4 --osd-journal=td/test-erasure-eio/4/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:53:59.834 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:53:59.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:53:59.837 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:53:59.854 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:59.853+0000 7f1432011780 -1 WARNING: all dangerous and experimental features are 
enabled. 2026-03-08T22:53:59.855 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:59.856+0000 7f1432011780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:59.857 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:53:59.857+0000 7f1432011780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:00.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4 2026-03-08T22:54:00.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:00.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4 2026-03-08T22:54:00.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:00.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:00.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:00.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:00.068 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:54:00.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:00.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:54:00.305 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:01.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:01.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:01.307 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:54:01.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:01.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:01.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:54:01.440 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:01.440+0000 7f1432011780 -1 Falling back to public interface 2026-03-08T22:54:01.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:02.299 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:02.299+0000 7f1432011780 -1 osd.4 0 log_to_monitors true 2026-03-08T22:54:02.552 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:54:02.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:02.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:02.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:54:02.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:02.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:54:02.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:03.410 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:03.410+0000 7f142d9e7640 -1 osd.4 0 waiting for initial osdmap 2026-03-08T22:54:03.831 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:54:03.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:03.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:03.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:54:03.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:03.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T22:54:04.068 INFO:tasks.workunit.client.0.vm04.stdout:osd.4 up in weight 1 up_from 36 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/850446594,v1:127.0.0.1:6835/850446594] [v2:127.0.0.1:6836/850446594,v1:127.0.0.1:6837/850446594] exists,up d8bf9d30-bdd7-4963-ab57-87c919386c97 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: 
status=0 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 5 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=5 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/5 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: 
ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:04.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/5' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/5/journal' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 
2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:54:04.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:04.070 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:04.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:04.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:04.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:04.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:54:04.071 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/5 2026-03-08T22:54:04.072 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:04.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=b1c45ac7-9192-49c0-8eb6-a754abc3fdce 2026-03-08T22:54:04.073 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd5 b1c45ac7-9192-49c0-8eb6-a754abc3fdce' 2026-03-08T22:54:04.073 INFO:tasks.workunit.client.0.vm04.stdout:add osd5 b1c45ac7-9192-49c0-8eb6-a754abc3fdce 2026-03-08T22:54:04.073 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:04.087 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAM/q1pY6M9BRAAv+PpUYRbSzdTxsn7in8gLQ== 2026-03-08T22:54:04.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAM/q1pY6M9BRAAv+PpUYRbSzdTxsn7in8gLQ=="}' 2026-03-08T22:54:04.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new b1c45ac7-9192-49c0-8eb6-a754abc3fdce -i td/test-erasure-eio/5/new.json 2026-03-08T22:54:04.388 INFO:tasks.workunit.client.0.vm04.stdout:5 2026-03-08T22:54:04.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/5/new.json 2026-03-08T22:54:04.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 5 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/5 --osd-journal=td/test-erasure-eio/5/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAM/q1pY6M9BRAAv+PpUYRbSzdTxsn7in8gLQ== --osd-uuid b1c45ac7-9192-49c0-8eb6-a754abc3fdce 2026-03-08T22:54:04.427 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:04.428+0000 7fc58e291780 -1 WARNING: all dangerous and experimental features are enabled. 
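Annotation: the `run_osd` provisioning sequence traced above (ceph-helpers.sh:660-678) repeats once per OSD: `mkdir -p`, `uuidgen`, `ceph-authtool --gen-print-key`, `ceph osd new`, `ceph-osd --mkfs`, `ceph auth add`, then the daemon start. A rough paraphrase follows; the `CEPH`/`CEPH_OSD`/`CEPH_AUTHTOOL` indirection variables are additions for this sketch (so it can be dry-run), not part of ceph-helpers.sh, and most of the real helper's `ceph_args` (fsid, journal, debug levels, admin socket, mclock profile, ...) are elided.

```shell
#!/usr/bin/env bash
# Sketch of the run_osd provisioning flow seen in the trace; not the
# verbatim helper. CEPH, CEPH_OSD and CEPH_AUTHTOOL default to the
# real binaries but can be pointed at stubs for a dry run.
: "${CEPH:=ceph}" "${CEPH_OSD:=ceph-osd}" "${CEPH_AUTHTOOL:=ceph-authtool}"

run_osd_sketch() {
    local dir=$1 id=$2
    local osd_data=$dir/$id
    mkdir -p "$osd_data"

    # Register a fresh OSD id with the monitors under a new uuid/key.
    local uuid osd_secret
    uuid=$(uuidgen)
    osd_secret=$("$CEPH_AUTHTOOL" --gen-print-key)
    echo "{\"cephx_secret\": \"$osd_secret\"}" > "$osd_data/new.json"
    "$CEPH" osd new "$uuid" -i "$osd_data/new.json"  # prints the allocated id
    rm "$osd_data/new.json"

    # Format the data dir, install the key, then start the daemon.
    "$CEPH_OSD" -i "$id" --osd-data="$osd_data" --mkfs \
        --key "$osd_secret" --osd-uuid "$uuid"
    "$CEPH" -i "$osd_data/keyring" auth add "osd.$id" osd 'allow *' \
        mon 'allow profile osd' mgr 'allow profile osd'
    "$CEPH_OSD" -i "$id" --osd-data="$osd_data"
}
```

In the log this corresponds, for osd.5, to `ceph osd new b1c45ac7-9192-49c0-8eb6-a754abc3fdce` printing `5`, the `--mkfs` invocation, `adding osd5 key to auth repository`, and `start osd.5`.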
2026-03-08T22:54:04.429 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:04.430+0000 7fc58e291780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:04.431 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:04.431+0000 7fc58e291780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:04.431 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:04.432+0000 7fc58e291780 -1 bdev(0x563be9327c00 td/test-erasure-eio/5/block) open stat got: (1) Operation not permitted 2026-03-08T22:54:04.432 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:04.432+0000 7fc58e291780 -1 bluestore(td/test-erasure-eio/5) _read_fsid unparsable uuid 2026-03-08T22:54:06.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/5/keyring 2026-03-08T22:54:06.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:54:06.828 INFO:tasks.workunit.client.0.vm04.stdout:adding osd5 key to auth repository 2026-03-08T22:54:06.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd5 key to auth repository 2026-03-08T22:54:06.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/5/keyring auth add osd.5 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:54:07.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.5 2026-03-08T22:54:07.157 INFO:tasks.workunit.client.0.vm04.stdout:start osd.5 2026-03-08T22:54:07.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
5 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/5 --osd-journal=td/test-erasure-eio/5/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:54:07.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:54:07.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:54:07.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:54:07.176 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:07.177+0000 7f944180e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:07.185 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:07.186+0000 7f944180e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:07.186 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:07.187+0000 7f944180e780 -1 WARNING: all dangerous and experimental features are enabled. 
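Annotation: the `wait_for_osd up 5` call that follows in the trace (ceph-helpers.sh:978-991) is a bounded retry loop: poll the osdmap up to 300 times, one second apart, until the OSD reports the requested state. A minimal paraphrase, with the `ceph osd dump | grep` probe factored into a hypothetical `osd_has_state` helper (an assumption for this sketch, not a function in ceph-helpers.sh) so the loop can be exercised offline:

```shell
#!/usr/bin/env bash
# Sketch of wait_for_osd as traced above; not the verbatim helper.

# Probe whether osd.<id> is reported in the given state ("up"/"down").
osd_has_state() {
    local state=$1 id=$2
    ceph osd dump | grep --quiet "osd.$id $state"
}

# Poll once per second, up to 300 attempts; return 0 on success,
# nonzero if the OSD never reaches the state (a test failure).
wait_for_osd() {
    local state=$1 id=$2
    local status=1 i
    for ((i = 0; i < 300; i++)); do
        if osd_has_state "$state" "$id"; then
            status=0
            break
        fi
        sleep 1
    done
    return $status
}
```

In the trace above, osd.4 satisfied the `grep 'osd.4 up'` probe on the fourth iteration; the 300-iteration cap (about five minutes) is what turns an OSD that never comes up into a hard failure.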
2026-03-08T22:54:07.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 5 2026-03-08T22:54:07.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:07.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T22:54:07.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:07.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:07.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:07.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:07.394 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:54:07.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:07.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:54:07.639 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:07.743 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:07.744+0000 7f944180e780 -1 Falling back to public interface 2026-03-08T22:54:08.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:54:08.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:08.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:08.641 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:54:08.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:08.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:54:08.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:09.102 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:09.102+0000 7f944180e780 -1 osd.5 0 log_to_monitors true 2026-03-08T22:54:09.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:09.876 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:09.876 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:09.876 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:54:09.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:09.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:54:10.143 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:11.144 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:11.144 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:11.144 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:54:11.144 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:54:11.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:11.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T22:54:11.378 INFO:tasks.workunit.client.0.vm04.stdout:osd.5 up in weight 1 up_from 44 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6842/2154619563,v1:127.0.0.1:6843/2154619563] [v2:127.0.0.1:6844/2154619563,v1:127.0.0.1:6845/2154619563] exists,up b1c45ac7-9192-49c0-8eb6-a754abc3fdce 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr 
$count - 1)) 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 6 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=6 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/6 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:11.379 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/6' 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/6/journal' 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:11.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:54:11.380 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:11.380 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:11.380 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:11.380 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:11.380 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:11.380 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:11.380 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:11.380 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:11.380 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:11.380 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:11.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:11.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:54:11.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:54:11.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:11.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:11.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:11.381 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:11.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:11.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:54:11.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/6 2026-03-08T22:54:11.382 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:11.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=4f6036d9-cab8-4e9f-8801-437837d6be77 2026-03-08T22:54:11.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd6 4f6036d9-cab8-4e9f-8801-437837d6be77' 2026-03-08T22:54:11.383 INFO:tasks.workunit.client.0.vm04.stdout:add osd6 4f6036d9-cab8-4e9f-8801-437837d6be77 2026-03-08T22:54:11.383 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:11.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAT/q1pF94DGBAAcJyRgZBAstNHzBxtsDJCsQ== 2026-03-08T22:54:11.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAT/q1pF94DGBAAcJyRgZBAstNHzBxtsDJCsQ=="}' 2026-03-08T22:54:11.401 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 4f6036d9-cab8-4e9f-8801-437837d6be77 -i td/test-erasure-eio/6/new.json 2026-03-08T22:54:11.646 INFO:tasks.workunit.client.0.vm04.stdout:6 2026-03-08T22:54:11.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/6/new.json 2026-03-08T22:54:11.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 6 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/6 --osd-journal=td/test-erasure-eio/6/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAT/q1pF94DGBAAcJyRgZBAstNHzBxtsDJCsQ== --osd-uuid 4f6036d9-cab8-4e9f-8801-437837d6be77 2026-03-08T22:54:11.689 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:11.690+0000 7fe9d140f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:11.701 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:11.702+0000 7fe9d140f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:11.703 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:11.704+0000 7fe9d140f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:11.703 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:11.704+0000 7fe9d140f780 -1 bdev(0x559b57bd7c00 td/test-erasure-eio/6/block) open stat got: (1) Operation not permitted 2026-03-08T22:54:11.704 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:11.705+0000 7fe9d140f780 -1 bluestore(td/test-erasure-eio/6) _read_fsid unparsable uuid 2026-03-08T22:54:14.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/6/keyring 2026-03-08T22:54:14.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:54:14.105 INFO:tasks.workunit.client.0.vm04.stdout:adding osd6 key to auth repository 2026-03-08T22:54:14.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd6 key to auth repository 2026-03-08T22:54:14.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/6/keyring auth add osd.6 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:54:14.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.6 2026-03-08T22:54:14.418 INFO:tasks.workunit.client.0.vm04.stdout:start osd.6 2026-03-08T22:54:14.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 6 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/6 --osd-journal=td/test-erasure-eio/6/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:54:14.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:54:14.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:54:14.423 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:54:14.444 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:14.443+0000 7f8cd260c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:14.444 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:14.445+0000 7f8cd260c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:14.446 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:14.446+0000 7f8cd260c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:14.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 6 2026-03-08T22:54:14.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:14.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=6 2026-03-08T22:54:14.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:14.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:14.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:14.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:14.672 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:54:14.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:14.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:54:14.893 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:15.787 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:15.788+0000 7f8cd260c780 -1 Falling back to public interface 2026-03-08T22:54:15.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:54:15.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:15.894 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:54:15.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:15.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:15.896 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:54:16.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:16.899 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:16.900+0000 7f8cd260c780 -1 osd.6 0 log_to_monitors true 2026-03-08T22:54:17.141 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:54:17.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:17.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:17.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:17.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:17.143 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:54:17.409 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:17.970 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:17.971+0000 7f8ccd65f640 -1 osd.6 0 waiting for initial osdmap 2026-03-08T22:54:18.411 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:54:18.411 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:18.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:18.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:54:18.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:18.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T22:54:18.640 INFO:tasks.workunit.client.0.vm04.stdout:osd.6 up in weight 1 up_from 52 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6850/1413777343,v1:127.0.0.1:6851/1413777343] [v2:127.0.0.1:6852/1413777343,v1:127.0.0.1:6853/1413777343] exists,up 4f6036d9-cab8-4e9f-8801-437837d6be77 2026-03-08T22:54:18.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:18.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:18.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:18.641 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 2026-03-08T22:54:18.641 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:54:18.641 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:54:18.641 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:54:18.641 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:18.641 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:18.641 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:54:18.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T22:54:18.642 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T22:54:18.696 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:54:18.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T22:54:18.706 
INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:53:32.913+0000 7f919540d780 0 load: jerasure load: lrc 2026-03-08T22:54:18.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:395: TEST_ec_single_recovery_error: local poolname=pool-jerasure 2026-03-08T22:54:18.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:396: TEST_ec_single_recovery_error: create_erasure_coded_pool pool-jerasure 3 2 2026-03-08T22:54:18.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:54:18.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T22:54:18.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=3 2026-03-08T22:54:18.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T22:54:18.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=2 2026-03-08T22:54:18.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T22:54:18.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=3 m=2 crush-failure-domain=osd 2026-03-08T22:54:19.005 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T22:54:19.005 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T22:54:19.376 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T22:54:19.389 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:54:20.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T22:54:20.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:54:20.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:54:20.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:54:20.391 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:54:20.391 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:54:20.391 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:54:20.391 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:54:20.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:54:20.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:54:20.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:54:20.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:54:20.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:54:20.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:54:20.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:54:20.473 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:54:20.705 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:54:20.705 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:54:20.705 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:54:20.705 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:54:20.705 
INFO:tasks.workunit.client.0.vm04.stderr:4 2026-03-08T22:54:20.705 INFO:tasks.workunit.client.0.vm04.stderr:5 2026-03-08T22:54:20.705 INFO:tasks.workunit.client.0.vm04.stderr:6' 2026-03-08T22:54:20.705 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:54:20.705 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:20.705 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:54:20.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803787 2026-03-08T22:54:20.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803787 2026-03-08T22:54:20.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787' 2026-03-08T22:54:20.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:20.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:54:20.869 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574858 2026-03-08T22:54:20.869 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574858 2026-03-08T22:54:20.869 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858' 2026-03-08T22:54:20.869 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:20.869 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:54:20.955 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378633 2026-03-08T22:54:20.955 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378633 2026-03-08T22:54:20.955 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-81604378633' 2026-03-08T22:54:20.955 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:20.955 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:54:21.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116999 2026-03-08T22:54:21.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116999 2026-03-08T22:54:21.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-81604378633 3-115964116999' 
2026-03-08T22:54:21.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:54:21.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T22:54:21.122 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=154618822662
2026-03-08T22:54:21.122 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 154618822662
2026-03-08T22:54:21.122 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-81604378633 3-115964116999 4-154618822662'
2026-03-08T22:54:21.122 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:54:21.122 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats
2026-03-08T22:54:21.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561028
2026-03-08T22:54:21.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561028
2026-03-08T22:54:21.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-81604378633 3-115964116999 4-154618822662 5-188978561028'
2026-03-08T22:54:21.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:54:21.216 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats
2026-03-08T22:54:21.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=223338299395
2026-03-08T22:54:21.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 223338299395
2026-03-08T22:54:21.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-81604378633 3-115964116999 4-154618822662 5-188978561028 6-223338299395'
2026-03-08T22:54:21.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:21.308 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803787
2026-03-08T22:54:21.308 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:21.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:54:21.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803787
2026-03-08T22:54:21.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:21.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803787
2026-03-08T22:54:21.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803787'
2026-03-08T22:54:21.310 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803787
2026-03-08T22:54:21.310 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:54:21.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803785 -lt 25769803787
2026-03-08T22:54:21.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:54:22.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:54:22.530 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:54:22.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803788 -lt 25769803787
2026-03-08T22:54:22.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:22.766 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574858
2026-03-08T22:54:22.766 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:22.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:54:22.767 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574858
2026-03-08T22:54:22.767 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:22.768 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574858
2026-03-08T22:54:22.768 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574858'
2026-03-08T22:54:22.768 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574858
2026-03-08T22:54:22.768 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:54:23.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574858 -lt 55834574858
2026-03-08T22:54:23.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:23.030 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378633
2026-03-08T22:54:23.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:23.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:54:23.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378633
2026-03-08T22:54:23.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:23.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378633
2026-03-08T22:54:23.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378633'
2026-03-08T22:54:23.033 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378633
2026-03-08T22:54:23.033 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:54:23.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378633 -lt 81604378633
2026-03-08T22:54:23.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:23.262 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116999
2026-03-08T22:54:23.262 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:23.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:54:23.264 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116999
2026-03-08T22:54:23.264 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:23.265 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116999
2026-03-08T22:54:23.265 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116999
2026-03-08T22:54:23.265 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116999'
2026-03-08T22:54:23.265 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:54:23.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117000 -lt 115964116999
2026-03-08T22:54:23.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:23.491 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-154618822662
2026-03-08T22:54:23.491 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:23.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T22:54:23.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-154618822662
2026-03-08T22:54:23.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:23.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=154618822662
2026-03-08T22:54:23.494 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 154618822662'
2026-03-08T22:54:23.494 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 154618822662
2026-03-08T22:54:23.494 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T22:54:23.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 154618822663 -lt 154618822662
2026-03-08T22:54:23.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:23.721 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-188978561028
2026-03-08T22:54:23.721 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:23.722 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5
2026-03-08T22:54:23.723 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-188978561028
2026-03-08T22:54:23.723 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
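The second half of the helper, traced above, splits each `osd-seq` pair with `cut` and polls `ceph osd last-stat-seq N` once a second (decrementing a 300-step budget, the `'[' 300 -eq 0 ']'` check) until the reported value catches up. A runnable sketch of that polling loop; the stub `ceph`, which lags on the first poll and catches up on the second via a counter file (needed because command substitution runs it in a subshell), is an assumption for illustration:

```shell
#!/usr/bin/env bash
# Counter file lets the stub change its answer across the subshells that
# command substitution creates.
polls=$(mktemp)
echo 0 > "$polls"

# Stub 'ceph osd last-stat-seq N' (assumption): behind on poll 1, caught up after.
ceph() {
    local n
    n=$(( $(cat "$polls") + 1 ))
    echo "$n" > "$polls"
    if [ "$n" -le 1 ]; then echo 25769803785; else echo 25769803788; fi
}

# Wait until every OSD's last-stat-seq reaches the seq recorded for it.
wait_for_seqs() {
    local seqs=$1 timeout=300 s osd seq
    for s in $seqs; do
        osd=$(echo "$s" | cut -d - -f 1)
        seq=$(echo "$s" | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
            sleep 1
            timeout=$((timeout - 1))
            if [ "$timeout" -eq 0 ]; then return 1; fi
        done
    done
}

wait_for_seqs "0-25769803787"
```

This mirrors the traced run for osd.0: the first poll returns 25769803785 (behind), the helper sleeps, and the second poll returns 25769803788, which ends the wait.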
2026-03-08T22:54:23.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561028
2026-03-08T22:54:23.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 188978561028'
2026-03-08T22:54:23.724 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 188978561028
2026-03-08T22:54:23.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5
2026-03-08T22:54:23.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561029 -lt 188978561028
2026-03-08T22:54:23.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:23.944 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-223338299395
2026-03-08T22:54:23.944 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:23.945 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6
2026-03-08T22:54:23.946 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-223338299395
2026-03-08T22:54:23.946 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:23.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=223338299395
2026-03-08T22:54:23.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 223338299395'
2026-03-08T22:54:23.947 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.6 seq 223338299395
2026-03-08T22:54:23.948 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6
2026-03-08T22:54:24.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 223338299396 -lt 223338299395
2026-03-08T22:54:24.180 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:54:24.180 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:54:24.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:54:24.481 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:54:24.481 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:54:24.481 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:54:24.482 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:54:24.482 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:54:24.482 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:54:24.482 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:54:24.482 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:54:24.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:54:24.714 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:54:24.715 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:54:24.715 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:398: TEST_ec_single_recovery_error: rados_put td/test-erasure-eio pool-jerasure myobject
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=myobject
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD
2026-03-08T22:54:25.028 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put myobject td/test-erasure-eio/ORIGINAL
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:399: TEST_ec_single_recovery_error: inject_eio ec data pool-jerasure myobject td/test-erasure-eio 0
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=myobject
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=0
2026-03-08T22:54:25.110 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift
2026-03-08T22:54:25.111 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure myobject
2026-03-08T22:54:25.111 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T22:54:25.111 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=myobject
2026-03-08T22:54:25.111 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure myobject
2026-03-08T22:54:25.111 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T22:54:25.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T22:54:25.342 INFO:tasks.workunit.client.0.vm04.stderr:5
2026-03-08T22:54:25.342 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:54:25.342 INFO:tasks.workunit.client.0.vm04.stderr:0
2026-03-08T22:54:25.342 INFO:tasks.workunit.client.0.vm04.stderr:6'
2026-03-08T22:54:25.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 5 1 0 6
2026-03-08T22:54:25.342 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '5' '1' '0' '6')
2026-03-08T22:54:25.342 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds
2026-03-08T22:54:25.342 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=3
2026-03-08T22:54:25.342 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']'
2026-03-08T22:54:25.343 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/3/type
2026-03-08T22:54:25.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore
2026-03-08T22:54:25.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 3 bluestore_debug_inject_read_err true
2026-03-08T22:54:25.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd
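`inject_eio`, traced above, first resolves the object's acting set with `get_osds`, which in `ceph-helpers.sh` pipes `ceph --format json osd map` through `jq '.acting | .[]'` and yields `3 5 1 0 6` here. A dependency-free sketch of the same extraction; the stub `ceph` output and the `sed`-based parse standing in for `jq` are assumptions so the sketch runs self-contained:

```shell
#!/usr/bin/env bash
# Stub 'ceph osd map' (assumption) returning the acting set seen in the trace.
ceph() {
    echo '{"acting":[3,5,1,0,6],"up":[3,5,1,0,6]}'
}

# The real helper pipes through: jq '.acting | .[]'
# sed is used here only to keep the sketch free of external dependencies.
get_osds() {
    local poolname=$1 objectname=$2
    ceph --format json osd map "$poolname" "$objectname" |
        sed -e 's/.*"acting":\[//' -e 's/\].*//' -e 's/,/ /g'
}

get_osds pool-jerasure myobject
```

With the shard's OSD known, the traced helper then enables `bluestore_debug_inject_read_err` through the OSD's admin socket and arms the read error for the object with `injectdataerr`, which is what the `set_config`/`get_asok_path` lines in the surrounding trace are doing.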
2026-03-08T22:54:25.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=3
2026-03-08T22:54:25.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err
2026-03-08T22:54:25.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true
2026-03-08T22:54:25.344 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")'
2026-03-08T22:54:25.344 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.3
2026-03-08T22:54:25.345 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3
2026-03-08T22:54:25.345 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']'
2026-03-08T22:54:25.345 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:54:25.345 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:25.345 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:54:25.345 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok
2026-03-08T22:54:25.345 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.3.asok config set bluestore_debug_inject_read_err true
2026-03-08T22:54:25.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true
2026-03-08T22:54:25.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0
2026-03-08T22:54:25.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid
2026-03-08T22:54:25.409 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.3
2026-03-08T22:54:25.409 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3
2026-03-08T22:54:25.409 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']'
2026-03-08T22:54:25.409 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:54:25.409 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:25.409 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:54:25.410 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok
2026-03-08T22:54:25.410 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS=
2026-03-08T22:54:25.410 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.3.asok injectdataerr pool-jerasure myobject 0
2026-03-08T22:54:25.478 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:401: TEST_ec_single_recovery_error: get_osds pool-jerasure myobject
2026-03-08T22:54:25.479 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T22:54:25.479 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=myobject
2026-03-08T22:54:25.479 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure myobject
2026-03-08T22:54:25.479 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr:5
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr:0
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr:6'
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 5 1 0 6
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:401: TEST_ec_single_recovery_error: initial_osds=('3' '5' '1' '0' '6')
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:401: TEST_ec_single_recovery_error: local -a initial_osds
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:402: TEST_ec_single_recovery_error: local last_osd=6
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:404: TEST_ec_single_recovery_error: kill_daemons td/test-erasure-eio TERM osd.6
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:54:25.719 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:54:25.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:54:25.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:54:25.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:54:26.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:54:26.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:405: TEST_ec_single_recovery_error: ceph osd down 6
2026-03-08T22:54:26.255 INFO:tasks.workunit.client.0.vm04.stderr:osd.6 is already down.
2026-03-08T22:54:26.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:406: TEST_ec_single_recovery_error: ceph osd out 6
2026-03-08T22:54:26.516 INFO:tasks.workunit.client.0.vm04.stderr:osd.6 is already out.
2026-03-08T22:54:26.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:409: TEST_ec_single_recovery_error: wait_for_clean
2026-03-08T22:54:26.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:54:26.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:54:26.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:54:26.530 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:54:26.530 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:54:26.530 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:54:26.530 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:54:26.530 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:54:26.530 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:54:26.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:54:26.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:54:26.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:54:26.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:54:26.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:54:26.614 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:54:26.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:54:26.845 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:54:26.845 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:54:26.845 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:54:26.845 INFO:tasks.workunit.client.0.vm04.stderr:4
2026-03-08T22:54:26.845 INFO:tasks.workunit.client.0.vm04.stderr:5
2026-03-08T22:54:26.845 INFO:tasks.workunit.client.0.vm04.stderr:6'
2026-03-08T22:54:26.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263:
flush_pg_stats: seqs= 2026-03-08T22:54:26.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:26.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:54:26.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803791 2026-03-08T22:54:26.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803791 2026-03-08T22:54:26.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791' 2026-03-08T22:54:26.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:26.923 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:54:27.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574861 2026-03-08T22:54:27.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574861 2026-03-08T22:54:27.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-55834574861' 2026-03-08T22:54:27.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:27.007 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:54:27.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378636 2026-03-08T22:54:27.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378636 2026-03-08T22:54:27.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-55834574861 2-81604378636' 2026-03-08T22:54:27.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:27.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:54:27.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117003 2026-03-08T22:54:27.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117003 2026-03-08T22:54:27.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-55834574861 2-81604378636 3-115964117003' 2026-03-08T22:54:27.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:27.172 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 
2026-03-08T22:54:27.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=154618822665 2026-03-08T22:54:27.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 154618822665 2026-03-08T22:54:27.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-55834574861 2-81604378636 3-115964117003 4-154618822665' 2026-03-08T22:54:27.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:27.251 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T22:54:27.336 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561032 2026-03-08T22:54:27.336 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561032 2026-03-08T22:54:27.336 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-55834574861 2-81604378636 3-115964117003 4-154618822665 5-188978561032' 2026-03-08T22:54:27.336 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:27.336 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T22:54:27.392 INFO:tasks.workunit.client.0.vm04.stderr:Error ENXIO: problem getting command 
descriptions from osd.6 2026-03-08T22:54:27.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq= 2026-03-08T22:54:27.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z '' 2026-03-08T22:54:27.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2268: flush_pg_stats: continue 2026-03-08T22:54:27.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:27.394 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803791 2026-03-08T22:54:27.394 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:27.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:54:27.395 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803791 2026-03-08T22:54:27.396 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:27.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803791 2026-03-08T22:54:27.396 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803791 2026-03-08T22:54:27.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803791' 
2026-03-08T22:54:27.397 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:54:27.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803788 -lt 25769803791 2026-03-08T22:54:27.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:54:28.620 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:54:28.621 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:54:28.869 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803791 -lt 25769803791 2026-03-08T22:54:28.869 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:28.869 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574861 2026-03-08T22:54:28.869 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:28.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:54:28.871 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574861 2026-03-08T22:54:28.871 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:28.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574861 2026-03-08T22:54:28.872 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574861 2026-03-08T22:54:28.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574861' 2026-03-08T22:54:28.873 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:54:29.102 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574861 -lt 55834574861 2026-03-08T22:54:29.102 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:29.102 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378636 2026-03-08T22:54:29.102 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:29.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:54:29.103 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378636 2026-03-08T22:54:29.104 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T22:54:29.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378636 2026-03-08T22:54:29.105 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378636 2026-03-08T22:54:29.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378636' 2026-03-08T22:54:29.105 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:54:29.342 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378636 -lt 81604378636 2026-03-08T22:54:29.342 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:29.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117003 2026-03-08T22:54:29.343 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:29.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:54:29.344 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117003 2026-03-08T22:54:29.344 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:29.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=115964117003 2026-03-08T22:54:29.346 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964117003 2026-03-08T22:54:29.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117003' 2026-03-08T22:54:29.346 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:54:29.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117003 -lt 115964117003 2026-03-08T22:54:29.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:29.578 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-154618822665 2026-03-08T22:54:29.578 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:29.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T22:54:29.580 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-154618822665 2026-03-08T22:54:29.580 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:29.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=154618822665 2026-03-08T22:54:29.581 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.4 seq 154618822665 2026-03-08T22:54:29.581 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 154618822665' 2026-03-08T22:54:29.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T22:54:29.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 154618822666 -lt 154618822665 2026-03-08T22:54:29.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:29.826 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-188978561032 2026-03-08T22:54:29.827 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:29.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T22:54:29.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-188978561032 2026-03-08T22:54:29.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:29.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561032 2026-03-08T22:54:29.829 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.5 seq 188978561032 2026-03-08T22:54:29.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 
188978561032' 2026-03-08T22:54:29.830 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T22:54:30.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561032 -lt 188978561032 2026-03-08T22:54:30.051 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:54:30.051 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:54:30.051 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:54:30.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:54:30.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:54:30.356 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:54:30.356 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:54:30.356 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:54:30.356 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: 
expression+='select(contains("stale") | not)' 2026-03-08T22:54:30.356 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:54:30.356 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:54:30.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T22:54:30.601 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:54:30.601 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:54:30.601 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:54:30.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 5 2026-03-08T22:54:30.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 3 '!=' -1 2026-03-08T22:54:30.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:54:30.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=3 2026-03-08T22:54:30.902 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:54:30.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:54:31.003 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:54:31.003 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:54:31.003 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:54:31.003 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:54:31.003 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:54:31.003 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:54:31.004 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:54:31.004 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:54:31.222 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T22:54:31.222 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:54:31.222 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:54:31.222 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:54:31.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 5 2026-03-08T22:54:31.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 3 '!=' 3 2026-03-08T22:54:31.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:54:31.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:54:31.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:54:31.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:54:31.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: 
recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:54:31.551 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:54:31.552 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:54:31.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=1023 2026-03-08T22:54:31.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test 1023 '!=' null 2026-03-08T22:54:31.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1677: wait_for_clean: loop=0 2026-03-08T22:54:31.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:54:31.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:54:31.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:54:31.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:54:31.958 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:54:31.958 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:54:31.958 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:54:31.958 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:54:31.958 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:54:31.959 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:54:32.173 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:54:32.173 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:54:32.173 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:54:32.174 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:54:32.487 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:54:32.487 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:54:32.487 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:54:32.487 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:411: TEST_ec_single_recovery_error: rados_get td/test-erasure-eio pool-jerasure myobject 2026-03-08T22:54:32.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:54:32.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:54:32.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=myobject 2026-03-08T22:54:32.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:54:32.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:54:32.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get myobject td/test-erasure-eio/COPY 2026-03-08T22:54:32.515 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:54:32.516 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:54:32.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:413: TEST_ec_single_recovery_error: delete_erasure_coded_pool pool-jerasure 2026-03-08T22:54:32.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:54:32.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it 2026-03-08T22:54:32.788 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T22:54:32.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile 2026-03-08T22:54:33.106 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T22:54:33.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T22:54:33.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:54:33.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:54:33.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons 
td/test-erasure-eio KILL 2026-03-08T22:54:33.118 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:54:33.118 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:54:33.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:54:33.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:54:33.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:54:33.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:54:33.244 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:54:33.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:54:33.246 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:54:33.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:54:33.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:54:33.247 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:54:33.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:54:33.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:54:33.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:54:33.248 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:54:33.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:54:33.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:54:33.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:54:33.280 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:54:33.280 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.280 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:33.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:54:33.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:54:33.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:54:33.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T22:54:33.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T22:54:33.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T22:54:33.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio 2026-03-08T22:54:33.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:54:33.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:54:33.285 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:54:33.285 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:54:33.285 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:54:33.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:54:33.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:54:33.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:54:33.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:54:33.286 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:54:33.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:54:33.287 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:54:33.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:54:33.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:54:33.289 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:54:33.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:54:33.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:54:33.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:54:33.291 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:54:33.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:54:33.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:54:33.293 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:54:33.294 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:54:33.294 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.294 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:33.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:54:33.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:54:33.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:54:33.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T22:54:33.296 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:54:33.296 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.297 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:33.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080 2026-03-08T22:54:33.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:54:33.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:54:33.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:54:33.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T22:54:33.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T22:54:33.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T22:54:33.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:54:33.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:54:33.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:54:33.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T22:54:33.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T22:54:33.331 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:54:33.331 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:33.331 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:33.331 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:33.332 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.332 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:33.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:33.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:54:33.364 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:54:33.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 
2026-03-08T22:54:33.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:54:33.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:54:33.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:54:33.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:54:33.365 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:54:33.365 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:54:33.365 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:54:33.367 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:54:33.368 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.368 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:33.368 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:54:33.368 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:54:33.368 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T22:54:33.422 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:54:33.422 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:54:33.422 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:54:33.422 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:54:33.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:54:33.423 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:54:33.423 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:54:33.423 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:54:33.423 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:54:33.423 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.423 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:33.423 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:54:33.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:54:33.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T22:54:33.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T22:54:33.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T22:54:33.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:54:33.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:54:33.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:54:33.478 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T22:54:33.478 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:54:33.602 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:54:33.602 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:33.602 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:33.602 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:33.602 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.602 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:33.602 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:33.603 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:54:33.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 
--run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:54:33.627 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T22:54:33.627 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:54:33.750 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:54:33.760 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:54:34.761 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T22:54:34.762 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:54:34.762 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:54:34.762 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:54:34.762 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:34.762 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:34.762 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:54:34.762 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T22:54:34.762 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T22:54:34.810 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:54:34.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T22:54:34.819 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:54:33.355+0000 7f28c13cdd80 0 load: jerasure load: lrc 2026-03-08T22:54:34.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_rados_get_bad_size_shard_0 td/test-erasure-eio 2026-03-08T22:54:34.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:302: TEST_rados_get_bad_size_shard_0: local dir=td/test-erasure-eio 2026-03-08T22:54:34.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:303: TEST_rados_get_bad_size_shard_0: setup_osds 4 2026-03-08T22:54:34.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=4 2026-03-08T22:54:34.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T22:54:34.820 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 4 - 1 2026-03-08T22:54:34.821 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 3 2026-03-08T22:54:34.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:54:34.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: 
run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 
2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:34.822 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:34.823 
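A quoting detail worth noting in the trace above: `run_osd` appends `--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok` inside single quotes, so the shell never expands `$cluster` or `$name`; the ceph daemon substitutes these metavariables itself at startup. A minimal sketch of that accumulation pattern (the fsid and asok directory below are copied from this run purely for illustration):

```shell
# Sketch of the ceph_args accumulation in run_osd (ceph-helpers.sh:639-659).
# Single quotes keep $cluster and $name literal: ceph-osd expands these
# metavariables itself, not the shell.
ceph_args='--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none'
ceph_args+=' --run-dir=td/test-erasure-eio'
ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
printf '%s\n' "$ceph_args"
```

This is why the final `ceph-osd` command lines in the log show `'--admin-socket=.../$cluster-$name.asok'` still unexpanded and re-quoted by xtrace.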
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:54:34.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T22:54:34.824 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:34.825 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=cfcf42cc-b5c9-462b-969b-e010fd36e8a2 2026-03-08T22:54:34.825 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 cfcf42cc-b5c9-462b-969b-e010fd36e8a2 2026-03-08T22:54:34.825 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 cfcf42cc-b5c9-462b-969b-e010fd36e8a2' 2026-03-08T22:54:34.825 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:34.839 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAq/q1p+HoXMhAAAZ81RBZafQvoKS79lO693g== 2026-03-08T22:54:34.839 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAq/q1p+HoXMhAAAZ81RBZafQvoKS79lO693g=="}' 2026-03-08T22:54:34.839 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new cfcf42cc-b5c9-462b-969b-e010fd36e8a2 -i td/test-erasure-eio/0/new.json 2026-03-08T22:54:34.972 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:54:34.984 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T22:54:34.985 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAq/q1p+HoXMhAAAZ81RBZafQvoKS79lO693g== --osd-uuid cfcf42cc-b5c9-462b-969b-e010fd36e8a2 2026-03-08T22:54:35.009 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:35.008+0000 7f9fd540d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:35.011 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:35.012+0000 7f9fd540d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:35.015 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:35.016+0000 7f9fd540d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:35.015 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:35.016+0000 7f9fd540d780 -1 bdev(0x561a3f126800 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:54:35.015 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:35.016+0000 7f9fd540d780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T22:54:37.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T22:54:37.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:54:37.666 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:54:37.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:54:37.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:54:37.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:54:37.971 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:54:37.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:54:37.971 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:54:37.972 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:54:37.974 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:54:37.991 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:37.991+0000 7f300fa14780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:37.993 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:37.994+0000 7f300fa14780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:37.995 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:37.995+0000 7f300fa14780 -1 WARNING: all dangerous and experimental features are enabled. 
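Before polling for the OSD, `run_osd` checks whether the cluster-wide `noup` flag is set (ceph-helpers.sh:681) by piping `ceph osd dump --format=json` through `jq '.flags_set[]'` and grepping for `"noup"`. A sketch of that gate against a canned dump, so it runs without a live cluster (the JSON payload here is invented for illustration; plain grep stands in for the jq step):

```shell
# noup gate from run_osd, fed a canned osd dump instead of a live cluster.
# The real helper runs: ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"'
osd_dump='{"epoch": 7, "flags_set": ["noout", "nobackfill"]}'
if printf '%s' "$osd_dump" | grep -q '"noup"'; then
    echo "noup set: not waiting for osd to come up"
else
    echo "noup clear: waiting for osd to come up"
fi
```

When `noup` is set, newly started OSDs are not marked up by the monitors, so waiting for `osd.N up` would spin until the 300-iteration limit.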
2026-03-08T22:54:38.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:54:38.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:38.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:54:38.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:38.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:38.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:38.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:38.201 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:54:38.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:38.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:38.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:38.814 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:38.814+0000 7f300fa14780 -1 Falling back to public interface 2026-03-08T22:54:39.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:54:39.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:39.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:39.428 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:54:39.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:39.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:39.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:39.695 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:39.695+0000 7f300fa14780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:54:40.637 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:54:40.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:40.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:40.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:40.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:40.637 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:40.876 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 
up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2537310598,v1:127.0.0.1:6803/2537310598] [v2:127.0.0.1:6804/2537310598,v1:127.0.0.1:6805/2537310598] exists,up cfcf42cc-b5c9-462b-969b-e010fd36e8a2 2026-03-08T22:54:40.876 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:40.876 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:40.876 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:54:40.877 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:40.877 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:40.877 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:40.878 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:40.878 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:40.878 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:40.878 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:40.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:40.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:40.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:40.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:40.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:54:40.878 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:54:40.878 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:40.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:40.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:40.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:40.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:40.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:54:40.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:54:40.880 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:40.880 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=d0f972bc-7f7b-4a97-a6fb-5b9afda8f701 2026-03-08T22:54:40.880 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 
d0f972bc-7f7b-4a97-a6fb-5b9afda8f701' 2026-03-08T22:54:40.881 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 d0f972bc-7f7b-4a97-a6fb-5b9afda8f701 2026-03-08T22:54:40.881 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:40.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAw/q1pTzNhNRAAwUJoIEH35xDq97uY/IEd1Q== 2026-03-08T22:54:40.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAw/q1pTzNhNRAAwUJoIEH35xDq97uY/IEd1Q=="}' 2026-03-08T22:54:40.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new d0f972bc-7f7b-4a97-a6fb-5b9afda8f701 -i td/test-erasure-eio/1/new.json 2026-03-08T22:54:41.153 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:54:41.169 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T22:54:41.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAw/q1pTzNhNRAAwUJoIEH35xDq97uY/IEd1Q== --osd-uuid d0f972bc-7f7b-4a97-a6fb-5b9afda8f701 2026-03-08T22:54:41.196 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:41.196+0000 7f762db92780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:41.198 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:41.199+0000 7f762db92780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:41.200 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:41.201+0000 7f762db92780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:41.200 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:41.201+0000 7f762db92780 -1 bdev(0x56409c4c5c00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:54:41.200 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:41.201+0000 7f762db92780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T22:54:43.400 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T22:54:43.400 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:54:43.401 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T22:54:43.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:54:43.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 
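The `wait_for_osd` calls traced above (ceph-helpers.sh:978-991) are a bounded poll: up to 300 one-second attempts of `ceph osd dump | grep "osd.$id up"`. A generic sketch of that retry shape, with the predicate passed in so it runs without a cluster:

```shell
# Generic form of the wait_for_osd loop (ceph-helpers.sh:982-991):
# retry a predicate command up to $max times, sleeping 1s between attempts.
# A real caller would use something like:
#   wait_for 300 sh -c 'ceph osd dump | grep -q "osd.0 up"'
wait_for() {
    local max=$1
    shift
    local i
    for ((i = 0; i < max; i++)); do
        if "$@"; then
            return 0    # predicate succeeded
        fi
        sleep 1
    done
    return 1            # timed out
}
```

In the log, osd.0 needed three iterations (echoes of 0, 1, 2 from line 983) before `grep 'osd.0 up'` matched and the loop broke with status 0.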
2026-03-08T22:54:43.712 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T22:54:43.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:54:43.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:54:43.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:54:43.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:54:43.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:54:43.731 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:43.732+0000 7f63c141e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:43.739 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:43.740+0000 7f63c141e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:43.741 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:43.741+0000 7f63c141e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:43.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:54:43.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:43.962 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:54:43.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:54:43.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:43.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:43.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:43.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:43.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:43.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:44.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:44.803 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:44.804+0000 7f63c141e780 -1 Falling back to public 
interface 2026-03-08T22:54:45.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:45.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:45.203 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:54:45.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:45.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:45.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:45.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:45.649 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:45.650+0000 7f63c141e780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:54:46.436 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:54:46.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:46.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:46.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:46.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:46.436 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:46.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:47.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:47.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:47.698 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:54:47.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:54:47.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:47.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:47.995 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/2514294116,v1:127.0.0.1:6811/2514294116] [v2:127.0.0.1:6812/2514294116,v1:127.0.0.1:6813/2514294116] exists,up d0f972bc-7f7b-4a97-a6fb-5b9afda8f701 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:47.996 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:54:47.996 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:47.996 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:54:47.997 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:47.997 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:47.997 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:47.997 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:47.997 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:47.997 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:47.997 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:47.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:47.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:47.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:47.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:47.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:54:47.998 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:54:47.998 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:47.998 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:47.998 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:47.998 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:47.998 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:47.998 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:54:47.998 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:54:48.001 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:48.001 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=530a07ac-3b11-438b-98fa-0c31c8397c9e 2026-03-08T22:54:48.001 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 530a07ac-3b11-438b-98fa-0c31c8397c9e 2026-03-08T22:54:48.002 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 530a07ac-3b11-438b-98fa-0c31c8397c9e' 2026-03-08T22:54:48.002 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:48.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQA4/q1p0EUHARAAvuLi9mn+Jqjhwr7jKR21lg== 2026-03-08T22:54:48.017 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQA4/q1p0EUHARAAvuLi9mn+Jqjhwr7jKR21lg=="}' 2026-03-08T22:54:48.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 530a07ac-3b11-438b-98fa-0c31c8397c9e -i td/test-erasure-eio/2/new.json 2026-03-08T22:54:48.241 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:54:48.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T22:54:48.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQA4/q1p0EUHARAAvuLi9mn+Jqjhwr7jKR21lg== --osd-uuid 530a07ac-3b11-438b-98fa-0c31c8397c9e 2026-03-08T22:54:48.273 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:48.274+0000 7fb0ea4d9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:48.275 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:48.276+0000 7fb0ea4d9780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:48.276 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:48.277+0000 7fb0ea4d9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:48.276 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:48.277+0000 7fb0ea4d9780 -1 bdev(0x55cbf1411c00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:54:48.277 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:48.278+0000 7fb0ea4d9780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T22:54:50.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T22:54:50.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:54:50.662 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:54:50.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:54:50.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:54:50.967 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:54:50.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:54:50.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:54:50.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:54:50.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:54:50.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:54:50.990 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:50.991+0000 7f8cf0a2c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:50.992 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:50.993+0000 7f8cf0a2c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:50.994 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:50.994+0000 7f8cf0a2c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:51.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:54:51.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:51.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:54:51.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:51.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:51.201 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:54:51.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:51.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:51.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:51.201 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:54:51.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:52.326 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:52.326+0000 7f8cf0a2c780 -1 Falling back to public interface 2026-03-08T22:54:52.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:54:52.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:52.427 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:54:52.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:52.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:52.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:54:52.658 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:53.181 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:53.182+0000 7f8cf0a2c780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:54:53.660 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:54:53.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:53.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:53.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:53.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:53.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:54:53.937 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:54.381 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:54.382+0000 7f8ceb77b640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T22:54:54.940 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:54:54.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:54.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:54.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:54:54.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:54.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:54:55.169 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 19 up_thru 19 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/60623555,v1:127.0.0.1:6819/60623555] [v2:127.0.0.1:6820/60623555,v1:127.0.0.1:6821/60623555] exists,up 530a07ac-3b11-438b-98fa-0c31c8397c9e 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:55.170 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:54:55.170 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:55.170 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:55.171 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:55.171 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:54:55.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:54:55.172 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:55.173 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=a0701ab8-3ad4-4bcd-b0e4-fd484a7aa23c 2026-03-08T22:54:55.173 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 a0701ab8-3ad4-4bcd-b0e4-fd484a7aa23c' 2026-03-08T22:54:55.173 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 a0701ab8-3ad4-4bcd-b0e4-fd484a7aa23c 2026-03-08T22:54:55.173 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:55.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQA//q1pAWs2CxAAgsMjVI3kt5FJ31O1SaCqdA== 2026-03-08T22:54:55.187 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQA//q1pAWs2CxAAgsMjVI3kt5FJ31O1SaCqdA=="}' 2026-03-08T22:54:55.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new a0701ab8-3ad4-4bcd-b0e4-fd484a7aa23c -i td/test-erasure-eio/3/new.json 2026-03-08T22:54:55.419 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:54:55.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json 2026-03-08T22:54:55.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQA//q1pAWs2CxAAgsMjVI3kt5FJ31O1SaCqdA== --osd-uuid a0701ab8-3ad4-4bcd-b0e4-fd484a7aa23c 2026-03-08T22:54:55.454 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:55.454+0000 7f104f42e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:55.455 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:55.457+0000 7f104f42e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:55.456 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:55.457+0000 7f104f42e780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:55.457 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:55.458+0000 7f104f42e780 -1 bdev(0x55c4e23b3c00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted
2026-03-08T22:54:55.457 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:55.458+0000 7f104f42e780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid
2026-03-08T22:54:58.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring
2026-03-08T22:54:58.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:54:58.039 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository
2026-03-08T22:54:58.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository
2026-03-08T22:54:58.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:54:58.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3
2026-03-08T22:54:58.351 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3
2026-03-08T22:54:58.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:54:58.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:54:58.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:54:58.354 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:54:58.372 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:58.371+0000 7f9845320780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:58.372 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:58.373+0000 7f9845320780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:58.375 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:58.375+0000 7f9845320780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:58.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3
2026-03-08T22:54:58.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:54:58.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3
2026-03-08T22:54:58.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:54:58.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:54:58.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:54:58.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:54:58.588 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:54:58.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:54:58.588 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:54:58.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:54:58.939 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:54:58.940+0000 7f9845320780 -1 Falling back to public interface
2026-03-08T22:54:59.854 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:54:59.854
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:54:59.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:54:59.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:54:59.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:54:59.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:55:00.120 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:00.337 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:00.338+0000 7f9845320780 -1 osd.3 0 log_to_monitors true
2026-03-08T22:55:01.123 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:55:01.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:01.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:01.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:55:01.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:01.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:01.373
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:02.377 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:55:02.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:02.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:02.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:55:02.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:02.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:55:02.649 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:02.650+0000 7f9840abf640 -1 osd.3 0 waiting for initial osdmap
2026-03-08T22:55:02.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:03.658 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:03.658 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:03.658 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4
2026-03-08T22:55:03.658 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T22:55:03.659
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:03.659 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:55:03.932 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 28 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/1569631231,v1:127.0.0.1:6827/1569631231] [v2:127.0.0.1:6828/1569631231,v1:127.0.0.1:6829/1569631231] exists,up a0701ab8-3ad4-4bcd-b0e4-fd484a7aa23c
2026-03-08T22:55:03.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:55:03.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:55:03.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:55:03.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0
2026-03-08T22:55:03.933 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:55:03.933 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:55:03.933 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:55:03.933 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:03.933
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:55:03.934 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok
2026-03-08T22:55:03.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS=
2026-03-08T22:55:03.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush
2026-03-08T22:55:03.986 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:55:03.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log
2026-03-08T22:55:03.994 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:54:38.817+0000 7f300fa14780 0 load: jerasure load: lrc
2026-03-08T22:55:03.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:305: TEST_rados_get_bad_size_shard_0: local poolname=pool-jerasure
2026-03-08T22:55:03.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:306: TEST_rados_get_bad_size_shard_0: create_erasure_coded_pool pool-jerasure 2 1
2026-03-08T22:55:03.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure
2026-03-08T22:55:03.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift
2026-03-08T22:55:03.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=2
2026-03-08T22:55:03.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift
2026-03-08T22:55:03.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=1
2026-03-08T22:55:03.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift
2026-03-08T22:55:03.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=2 m=1 crush-failure-domain=osd
2026-03-08T22:55:04.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile
2026-03-08T22:55:04.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile
2026-03-08T22:55:04.664 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists
2026-03-08T22:55:04.674 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T22:55:05.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T22:55:05.676
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:55:05.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:55:05.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:55:05.676 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:55:05.676 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:55:05.676 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:55:05.677 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:55:05.677 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:55:05.677 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:55:05.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:55:05.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:55:05.752
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:55:05.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:55:05.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:55:05.752 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:55:05.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:55:05.975 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:55:05.975 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:55:05.975 INFO:tasks.workunit.client.0.vm04.stderr:3'
2026-03-08T22:55:05.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:55:05.975 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:05.975 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:55:06.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803784
2026-03-08T22:55:06.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803784 2026-03-08T22:55:06.062
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803784'
2026-03-08T22:55:06.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:06.062 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:55:06.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574854
2026-03-08T22:55:06.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574854
2026-03-08T22:55:06.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803784 1-55834574854'
2026-03-08T22:55:06.137 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:06.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:55:06.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378629
2026-03-08T22:55:06.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378629
2026-03-08T22:55:06.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803784 1-55834574854 2-81604378629' 2026-03-08T22:55:06.218
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:06.218 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:55:06.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116995
2026-03-08T22:55:06.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116995
2026-03-08T22:55:06.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803784 1-55834574854 2-81604378629 3-115964116995'
2026-03-08T22:55:06.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:06.295 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803784
2026-03-08T22:55:06.295 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:06.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:55:06.296 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803784
2026-03-08T22:55:06.296 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:06.297
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803784
2026-03-08T22:55:06.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803784'
2026-03-08T22:55:06.297 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803784
2026-03-08T22:55:06.297 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:06.534 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803782 -lt 25769803784
2026-03-08T22:55:06.534 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:55:07.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:55:07.536 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:07.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803782 -lt 25769803784
2026-03-08T22:55:07.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:55:08.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T22:55:08.807 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:09.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803784 -lt 25769803784
2026-03-08T22:55:09.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:09.050 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574854
2026-03-08T22:55:09.050 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:09.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:55:09.051 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574854
2026-03-08T22:55:09.051 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:09.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574854
2026-03-08T22:55:09.052 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574854
2026-03-08T22:55:09.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574854'
2026-03-08T22:55:09.053 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:55:09.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574855 -lt 55834574854
2026-03-08T22:55:09.281 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:09.281 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378629
2026-03-08T22:55:09.281 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:09.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:55:09.282 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378629
2026-03-08T22:55:09.283 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:09.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378629
2026-03-08T22:55:09.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378629'
2026-03-08T22:55:09.284 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378629
2026-03-08T22:55:09.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:55:09.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378629 -lt 81604378629 2026-03-08T22:55:09.528
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:09.529 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116995
2026-03-08T22:55:09.529 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:09.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:55:09.531 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116995
2026-03-08T22:55:09.531 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:09.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116995
2026-03-08T22:55:09.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116995'
2026-03-08T22:55:09.532 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116995
2026-03-08T22:55:09.532 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:55:09.770 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116995 -lt 115964116995
2026-03-08T22:55:09.770 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:55:09.771 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:55:09.771 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:55:10.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:55:10.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:55:10.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:55:10.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:55:10.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:55:10.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:55:10.090 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:55:10.090 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:55:10.329
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:55:10.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:55:10.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:55:10.330 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:55:10.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:55:10.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:55:10.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:55:10.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:308: TEST_rados_get_bad_size_shard_0: local shard_id=0
2026-03-08T22:55:10.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:309: TEST_rados_get_bad_size_shard_0: rados_get_data_bad_size td/test-erasure-eio 0 10
2026-03-08T22:55:10.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:222: rados_get_data_bad_size: local dir=td/test-erasure-eio
2026-03-08T22:55:10.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:223: rados_get_data_bad_size: shift
2026-03-08T22:55:10.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:224: rados_get_data_bad_size: local shard_id=0 2026-03-08T22:55:10.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:225: rados_get_data_bad_size: shift 2026-03-08T22:55:10.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:226: rados_get_data_bad_size: local bytes=10 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:227: rados_get_data_bad_size: shift 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:228: rados_get_data_bad_size: local mode=set 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:230: rados_get_data_bad_size: local poolname=pool-jerasure 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:231: rados_get_data_bad_size: local objname=obj-size-102080-0-10 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:232: rados_get_data_bad_size: rados_put td/test-erasure-eio pool-jerasure obj-size-102080-0-10 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local 
poolname=pool-jerasure 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=obj-size-102080-0-10 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:55:10.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put obj-size-102080-0-10 td/test-erasure-eio/ORIGINAL 
2026-03-08T22:55:10.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:236: rados_get_data_bad_size: set_size obj-size-102080-0-10 td/test-erasure-eio 0 10 set 2026-03-08T22:55:10.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local objname=obj-size-102080-0-10 2026-03-08T22:55:10.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift 2026-03-08T22:55:10.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local dir=td/test-erasure-eio 2026-03-08T22:55:10.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift 2026-03-08T22:55:10.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=0 2026-03-08T22:55:10.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift 2026-03-08T22:55:10.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=10 2026-03-08T22:55:10.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift 2026-03-08T22:55:10.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=set 2026-03-08T22:55:10.688 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure 2026-03-08T22:55:10.688 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: get_osds pool-jerasure obj-size-102080-0-10 2026-03-08T22:55:10.689 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:55:10.689 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-size-102080-0-10 2026-03-08T22:55:10.689 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-size-102080-0-10 2026-03-08T22:55:10.689 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:55:10.922 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:55:10.922 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:10.923 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:55:10.923 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:55:10.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2') 2026-03-08T22:55:10.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds 2026-03-08T22:55:10.923 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=3 2026-03-08T22:55:10.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout 2026-03-08T22:55:11.216 INFO:tasks.workunit.client.0.vm04.stderr:noout is set 2026-03-08T22:55:11.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' set = add ']' 2026-03-08T22:55:11.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:210: set_size: '[' 10 = 0 ']' 2026-03-08T22:55:11.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:214: set_size: dd if=/dev/urandom bs=10 count=1 of=td/test-erasure-eio/CORRUPT 2026-03-08T22:55:11.229 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in 2026-03-08T22:55:11.229 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out 2026-03-08T22:55:11.229 INFO:tasks.workunit.client.0.vm04.stderr:10 bytes copied, 7.4459e-05 s, 134 kB/s 2026-03-08T22:55:11.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 3 obj-size-102080-0-10 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:11.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:55:11.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:55:11.230 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:55:11.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:55:11.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 3 obj-size-102080-0-10 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:11.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:55:11.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:55:11.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:55:11.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:55:11.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.3 2026-03-08T22:55:11.230 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:11.230 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:11.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 
2026-03-08T22:55:11.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:11.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:11.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:11.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 3 obj-size-102080-0-10 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:11.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:55:11.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:55:11.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T22:55:11.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:55:11.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/3 2026-03-08T22:55:11.337 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/3 obj-size-102080-0-10 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:12.450 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 3 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:12.450 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:55:12.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:55:12.451 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:55:12.451 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:12.451 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:12.451 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:12.451 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:12.451 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:55:12.451 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:55:12.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:55:12.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:12.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:55:12.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:55:12.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:12.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:12.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:12.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:12.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 
2026-03-08T22:55:12.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:55:12.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:55:12.453 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:55:12.453 INFO:tasks.workunit.client.0.vm04.stderr:start osd.3 2026-03-08T22:55:12.453 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:55:12.453 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/3/whoami 2026-03-08T22:55:12.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:55:12.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:55:12.455 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:55:12.457 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:55:12.470 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:12.471+0000 7f1b87c82780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:12.477 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:12.478+0000 7f1b87c82780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:12.477 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:12.478+0000 7f1b87c82780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:12.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:55:12.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:55:12.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:55:12.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:55:12.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:55:12.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:12.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T22:55:12.690 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:55:12.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:12.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:12.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:13.806 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:13.807+0000 7f1b87c82780 -1 Falling back to public interface 2026-03-08T22:55:13.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:13.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:13.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:55:13.932 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:13.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:13.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:14.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:14.672 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:14.673+0000 7f1b87c82780 -1 osd.3 35 log_to_monitors true 2026-03-08T22:55:15.178 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:15.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:15.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:55:15.178 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:55:15.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:15.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:15.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:16.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:16.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:16.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:55:16.429 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:55:16.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:16.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:16.660 INFO:tasks.workunit.client.0.vm04.stderr:osd.3 up in weight 1 up_from 41 
up_thru 41 down_at 38 last_clean_interval [27,35) [v2:127.0.0.1:6826/3162595716,v1:127.0.0.1:6827/3162595716] [v2:127.0.0.1:6828/3162595716,v1:127.0.0.1:6829/3162595716] exists,up a0701ab8-3ad4-4bcd-b0e4-fd484a7aa23c 2026-03-08T22:55:16.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:55:16.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:55:16.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:55:16.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:55:16.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:55:16.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:55:16.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:55:16.661 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:55:16.661 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:55:16.661 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:55:16.661 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:55:16.661 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:55:16.661 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:55:16.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:55:16.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:55:16.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:55:16.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:55:16.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:55:16.739 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:55:16.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:55:16.978 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:16.978 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:55:16.978 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:55:16.978 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:55:16.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:16.979 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:55:17.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803788 2026-03-08T22:55:17.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803788 2026-03-08T22:55:17.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788' 2026-03-08T22:55:17.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:17.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:55:17.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574859 2026-03-08T22:55:17.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574859 2026-03-08T22:55:17.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788 1-55834574859' 2026-03-08T22:55:17.130 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:17.131 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:55:17.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378633 2026-03-08T22:55:17.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378633 2026-03-08T22:55:17.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788 1-55834574859 2-81604378633' 2026-03-08T22:55:17.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:17.216 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:55:17.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=176093659139 2026-03-08T22:55:17.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 176093659139 2026-03-08T22:55:17.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788 1-55834574859 2-81604378633 3-176093659139' 2026-03-08T22:55:17.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:17.295 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803788 2026-03-08T22:55:17.295 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:17.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:55:17.297 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803788 2026-03-08T22:55:17.297 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:17.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803788 2026-03-08T22:55:17.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803788' 2026-03-08T22:55:17.298 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803788 2026-03-08T22:55:17.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:17.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803786 -lt 25769803788 2026-03-08T22:55:17.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:55:18.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T22:55:18.520 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:18.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803788 -lt 25769803788 2026-03-08T22:55:18.753 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:18.753 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574859 2026-03-08T22:55:18.753 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:18.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:55:18.754 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574859 2026-03-08T22:55:18.755 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:18.756 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574859 2026-03-08T22:55:18.756 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574859' 2026-03-08T22:55:18.756 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 55834574859 2026-03-08T22:55:18.756 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 1 2026-03-08T22:55:18.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574859 -lt 55834574859 2026-03-08T22:55:18.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:18.992 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378633 2026-03-08T22:55:18.992 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:18.993 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:55:18.994 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378633 2026-03-08T22:55:18.994 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:18.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378633 2026-03-08T22:55:18.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378633' 2026-03-08T22:55:18.995 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 81604378633 2026-03-08T22:55:18.995 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:55:19.223 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 81604378633 -lt 81604378633 2026-03-08T22:55:19.223 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:19.224 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-176093659139 2026-03-08T22:55:19.224 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:19.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:55:19.226 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-176093659139 2026-03-08T22:55:19.226 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:19.227 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=176093659139 2026-03-08T22:55:19.227 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 176093659139' 2026-03-08T22:55:19.227 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 176093659139 2026-03-08T22:55:19.228 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:55:19.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 176093659139 -lt 176093659139 2026-03-08T22:55:19.454 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:55:19.455 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:19.455 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:19.766 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:55:19.766 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:55:19.766 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:55:19.766 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:55:19.767 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:55:19.767 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:55:19.767 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:55:19.767 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | 
select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:55:19.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:55:19.999 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:55:19.999 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:19.999 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:20.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:55:20.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:55:20.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:55:20.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT 2026-03-08T22:55:20.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout 2026-03-08T22:55:20.641 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset 2026-03-08T22:55:20.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:238: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure obj-size-102080-0-10 
2026-03-08T22:55:20.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:55:20.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:55:20.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-0-10 2026-03-08T22:55:20.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:55:20.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:55:20.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get obj-size-102080-0-10 td/test-erasure-eio/COPY 2026-03-08T22:55:20.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:55:20.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:55:20.688 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: rados_get_data_bad_size: expr 0 + 1 2026-03-08T22:55:20.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: rados_get_data_bad_size: shard_id=1 
2026-03-08T22:55:20.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:242: rados_get_data_bad_size: set_size obj-size-102080-0-10 td/test-erasure-eio 1 10 set 2026-03-08T22:55:20.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local objname=obj-size-102080-0-10 2026-03-08T22:55:20.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift 2026-03-08T22:55:20.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local dir=td/test-erasure-eio 2026-03-08T22:55:20.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift 2026-03-08T22:55:20.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=1 2026-03-08T22:55:20.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift 2026-03-08T22:55:20.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=10 2026-03-08T22:55:20.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift 2026-03-08T22:55:20.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=set 2026-03-08T22:55:20.690 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure 2026-03-08T22:55:20.690 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: get_osds pool-jerasure obj-size-102080-0-10 2026-03-08T22:55:20.690 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:55:20.690 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-size-102080-0-10 2026-03-08T22:55:20.690 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-size-102080-0-10 2026-03-08T22:55:20.690 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:55:20.929 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:55:20.929 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:20.929 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:55:20.929 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:55:20.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2') 2026-03-08T22:55:20.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds 2026-03-08T22:55:20.930 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=1 2026-03-08T22:55:20.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout 2026-03-08T22:55:21.346 INFO:tasks.workunit.client.0.vm04.stderr:noout is set 2026-03-08T22:55:21.360 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' set = add ']' 2026-03-08T22:55:21.360 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:210: set_size: '[' 10 = 0 ']' 2026-03-08T22:55:21.360 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:214: set_size: dd if=/dev/urandom bs=10 count=1 of=td/test-erasure-eio/CORRUPT 2026-03-08T22:55:21.360 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in 2026-03-08T22:55:21.360 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out 2026-03-08T22:55:21.360 INFO:tasks.workunit.client.0.vm04.stderr:10 bytes copied, 7.2024e-05 s, 139 kB/s 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 1 obj-size-102080-0-10 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:55:21.361 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 1 obj-size-102080-0-10 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.1 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 
2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:21.361 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:21.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:21.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 1 obj-size-102080-0-10 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:21.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:55:21.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:55:21.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:55:21.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:55:21.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/1 2026-03-08T22:55:21.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/1 obj-size-102080-0-10 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:22.822 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 1 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:22.823 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:22.823 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:55:22.823 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 
2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/1
2026-03-08T22:55:22.824 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T22:55:22.825 INFO:tasks.workunit.client.0.vm04.stderr:start osd.1
2026-03-08T22:55:22.825 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:55:22.825 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/1/whoami
2026-03-08T22:55:22.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T22:55:22.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:55:22.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:55:22.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:55:22.843 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:22.844+0000 7f775919c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:22.848 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:22.849+0000 7f775919c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:22.849 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:22.850+0000 7f775919c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:23.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T22:55:23.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:55:23.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:55:23.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:55:23.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:55:23.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:23.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:55:23.058 INFO:tasks.workunit.client.0.vm04.stderr:0
2026-03-08T22:55:23.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:23.058 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:23.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:23.927 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:23.928+0000 7f775919c780 -1 Falling back to public interface
2026-03-08T22:55:24.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:24.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:24.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:55:24.286 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:55:24.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:24.288 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:24.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:24.772 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:24.773+0000 7f775919c780 -1 osd.1 46 log_to_monitors true
2026-03-08T22:55:25.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:25.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:25.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:55:25.528 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:55:25.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:25.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:25.802 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 up in weight 1 up_from 50 up_thru 38 down_at 47 last_clean_interval [13,46) [v2:127.0.0.1:6810/3273234328,v1:127.0.0.1:6811/3273234328] [v2:127.0.0.1:6812/3273234328,v1:127.0.0.1:6813/3273234328] exists,up d0f972bc-7f7b-4a97-a6fb-5b9afda8f701
2026-03-08T22:55:25.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:55:25.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:55:25.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:55:25.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:55:25.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:55:25.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:55:25.802 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:55:25.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:55:25.803 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:55:25.803 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:55:25.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:55:25.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:55:25.803 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:55:25.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:55:25.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:55:25.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:55:25.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:55:25.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:55:25.894 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:55:26.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:55:26.134 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:55:26.134 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:55:26.134 INFO:tasks.workunit.client.0.vm04.stderr:3'
2026-03-08T22:55:26.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:55:26.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:26.135 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:55:26.212 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803792
2026-03-08T22:55:26.213 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803792
2026-03-08T22:55:26.213 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803792'
2026-03-08T22:55:26.213 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:26.213 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:55:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364803
2026-03-08T22:55:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364803
2026-03-08T22:55:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803792 1-214748364803'
2026-03-08T22:55:26.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:26.291 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:55:26.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378637
2026-03-08T22:55:26.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378637
2026-03-08T22:55:26.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803792 1-214748364803 2-81604378637'
2026-03-08T22:55:26.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:26.367 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:55:26.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=176093659143
2026-03-08T22:55:26.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 176093659143
2026-03-08T22:55:26.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803792 1-214748364803 2-81604378637 3-176093659143'
2026-03-08T22:55:26.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:26.444 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803792
2026-03-08T22:55:26.444 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:26.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:55:26.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803792
2026-03-08T22:55:26.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:26.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803792
2026-03-08T22:55:26.447 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803792'
2026-03-08T22:55:26.447 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803792
2026-03-08T22:55:26.447 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:26.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803790 -lt 25769803792
2026-03-08T22:55:26.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:55:27.670 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:55:27.670 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:27.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803790 -lt 25769803792
2026-03-08T22:55:27.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:55:28.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T22:55:28.905 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:29.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803792 -lt 25769803792
2026-03-08T22:55:29.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:29.135 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-214748364803
2026-03-08T22:55:29.135 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:29.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:55:29.137 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-214748364803
2026-03-08T22:55:29.137 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:29.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364803
2026-03-08T22:55:29.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 214748364803'
2026-03-08T22:55:29.138 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 214748364803
2026-03-08T22:55:29.139 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:55:29.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364803 -lt 214748364803
2026-03-08T22:55:29.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:29.368 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378637
2026-03-08T22:55:29.368 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:29.369 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:55:29.370 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378637
2026-03-08T22:55:29.370 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:29.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378637
2026-03-08T22:55:29.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378637'
2026-03-08T22:55:29.371 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 81604378637
2026-03-08T22:55:29.371 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:55:29.612 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378637 -lt 81604378637
2026-03-08T22:55:29.612 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:29.612 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-176093659143 2026-03-08T22:55:29.612 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:29.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:55:29.615 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-176093659143 2026-03-08T22:55:29.615 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:29.616 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=176093659143 2026-03-08T22:55:29.616 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 176093659143' 2026-03-08T22:55:29.616 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 176093659143 2026-03-08T22:55:29.616 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:55:29.860 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 176093659143 -lt 176093659143 2026-03-08T22:55:29.860 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:55:29.861 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph 
--format json status 2026-03-08T22:55:29.861 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:30.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:55:30.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:55:30.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:55:30.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:55:30.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:55:30.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:55:30.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:55:30.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:55:30.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:55:30.386 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:55:30.386 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:30.386 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:30.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:55:30.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:55:30.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:55:30.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT 2026-03-08T22:55:30.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout 2026-03-08T22:55:31.024 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset 2026-03-08T22:55:31.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:243: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure obj-size-102080-0-10 fail 2026-03-08T22:55:31.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:55:31.037 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:55:31.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-0-10 2026-03-08T22:55:31.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=fail 2026-03-08T22:55:31.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' fail = fail ']' 2026-03-08T22:55:31.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:114: rados_get: rados --pool pool-jerasure get obj-size-102080-0-10 td/test-erasure-eio/COPY 2026-03-08T22:55:31.062 INFO:tasks.workunit.client.0.vm04.stderr:error getting pool-jerasure/obj-size-102080-0-10: (5) Input/output error 2026-03-08T22:55:31.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:115: rados_get: return 2026-03-08T22:55:31.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:244: rados_get_data_bad_size: rm td/test-erasure-eio/ORIGINAL 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:310: TEST_rados_get_bad_size_shard_0: rados_get_data_bad_size td/test-erasure-eio 0 0 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:222: rados_get_data_bad_size: local dir=td/test-erasure-eio 2026-03-08T22:55:31.065 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:223: rados_get_data_bad_size: shift 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:224: rados_get_data_bad_size: local shard_id=0 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:225: rados_get_data_bad_size: shift 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:226: rados_get_data_bad_size: local bytes=0 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:227: rados_get_data_bad_size: shift 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:228: rados_get_data_bad_size: local mode=set 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:230: rados_get_data_bad_size: local poolname=pool-jerasure 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:231: rados_get_data_bad_size: local objname=obj-size-102080-0-0 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:232: rados_get_data_bad_size: rados_put td/test-erasure-eio pool-jerasure obj-size-102080-0-0 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:55:31.065 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=obj-size-102080-0-0 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:31.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:55:31.065 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put obj-size-102080-0-0 td/test-erasure-eio/ORIGINAL 2026-03-08T22:55:31.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:236: rados_get_data_bad_size: set_size obj-size-102080-0-0 td/test-erasure-eio 0 0 set 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local objname=obj-size-102080-0-0 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local dir=td/test-erasure-eio 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=0 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=0 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift 2026-03-08T22:55:31.093 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=set 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: get_osds pool-jerasure obj-size-102080-0-0 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-size-102080-0-0 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-size-102080-0-0 2026-03-08T22:55:31.093 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:55:31.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:55:31.332 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:31.332 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:55:31.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:55:31.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2') 2026-03-08T22:55:31.332 
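`get_osds`, as traced here, resolves an object's acting set by mapping it through the OSDMap and pulling the `acting` array out with jq. A sketch of that pattern (assumes a reachable cluster and `jq` on PATH; condensed from the trace, not the exact helper body):

```shell
# Sketch of the get_osds helper traced above: ask the monitor where the
# object maps, then extract the acting-set OSD ids from the JSON reply.
get_osds() {
    local poolname=$1
    local objectname=$2
    local osds
    osds=$(ceph --format json osd map "$poolname" "$objectname" | \
           jq '.acting | .[]')
    # echo unquoted on purpose: jq prints one id per line, and word
    # splitting collapses them to a single space-separated line ("3 1 2")
    echo $osds
}
```

The caller in `set_size` captures this into a bash array (`initial_osds=($(get_osds ...))`), which is why the newline-to-space collapse matters.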
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds 2026-03-08T22:55:31.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=3 2026-03-08T22:55:31.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout 2026-03-08T22:55:31.616 INFO:tasks.workunit.client.0.vm04.stderr:noout is set 2026-03-08T22:55:31.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' set = add ']' 2026-03-08T22:55:31.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:210: set_size: '[' 0 = 0 ']' 2026-03-08T22:55:31.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:212: set_size: touch td/test-erasure-eio/CORRUPT 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 3 obj-size-102080-0-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:55:31.631 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 3 obj-size-102080-0-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.3 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:31.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:31.631 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:31.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:31.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 3 obj-size-102080-0-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:31.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:55:31.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:55:31.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T22:55:31.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:55:31.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/3 2026-03-08T22:55:31.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/3 obj-size-102080-0-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 3 2026-03-08T22:55:32.847 
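The `objectstore_tool` wrapper being traced follows a stop-modify-restart cycle: kill the OSD, run `ceph-objectstore-tool` against its offline data path, reactivate the daemon, and wait for PGs to go clean. A condensed sketch of that control flow (helper names taken from the trace; their bodies live in ceph-helpers.sh and are not reproduced here):

```shell
# Condensed sketch of the objectstore_tool cycle traced above: the object
# store can only be edited while the OSD is down, so the wrapper brackets
# the ceph-objectstore-tool call with a stop and a restart.
objectstore_tool() {
    local dir=$1 ; shift
    local id=$1 ; shift
    kill_daemons "$dir" TERM osd.$id                   # stop the OSD cleanly
    ceph-objectstore-tool --data-path "$dir/$id" "$@"  # offline edit
    activate_osd "$dir" $id                            # bring the OSD back up
    wait_for_clean                                     # let recovery settle
}

# In the trace this corrupts one erasure-coded shard:
#   objectstore_tool td/test-erasure-eio 3 obj-size-102080-0-0 \
#       set-bytes td/test-erasure-eio/CORRUPT
```

The `noout` flag set just before (test-erasure-eio.sh:205) keeps the mon from rebalancing data away while the OSD is briefly down.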
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:55:32.847 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:55:32.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:55:32.848 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:55:32.848 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:32.848 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:32.848 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:32.848 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:32.848 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:55:32.848 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:55:32.849 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:55:32.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:32.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:55:32.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:55:32.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:32.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:32.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:32.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:32.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:55:32.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:55:32.849 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:55:32.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:55:32.850 INFO:tasks.workunit.client.0.vm04.stderr:start osd.3 2026-03-08T22:55:32.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:55:32.850 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/3/whoami 2026-03-08T22:55:32.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:55:32.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:55:32.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:55:32.855 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:55:32.870 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:32.871+0000 7f5340eba780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:32.873 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:32.874+0000 7f5340eba780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:32.876 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:32.875+0000 7f5340eba780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:33.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:55:33.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:55:33.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:55:33.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:55:33.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:55:33.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:33.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:55:33.138 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:55:33.138 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:33.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:33.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:33.438 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:33.439+0000 7f5340eba780 -1 Falling back to public interface 2026-03-08T22:55:34.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:34.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:34.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:55:34.419 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:34.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:34.420 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:34.574 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:34.575+0000 7f5340eba780 -1 osd.3 54 log_to_monitors true 2026-03-08T22:55:34.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:35.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:35.668 
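The `wait_for_osd` polling visible in these iterations is a plain bounded retry loop: dump the OSDMap, grep for `osd.<id> up`, sleep a second between attempts, and give up after 300 tries. A slightly condensed sketch (the trace also echoes the loop counter, omitted here):

```shell
# Sketch of the wait_for_osd loop traced above: poll "ceph osd dump"
# until the daemon shows the requested state, up to 300 one-second tries.
wait_for_osd() {
    local state=$1      # e.g. "up"
    local id=$2
    local status=1
    for ((i = 0; i < 300; i++)); do
        if ceph osd dump | grep -q "osd.$id $state"; then
            status=0
            break
        fi
        sleep 1
    done
    return $status
}
```

In the trace the OSD shows up on the third poll, after which the helper returns 0 and `objectstore_tool` moves on to `wait_for_clean`.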
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:35.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:55:35.668 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:55:35.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:35.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:35.923 INFO:tasks.workunit.client.0.vm04.stderr:osd.3 up in weight 1 up_from 59 up_thru 59 down_at 56 last_clean_interval [41,54) [v2:127.0.0.1:6826/3107028546,v1:127.0.0.1:6827/3107028546] [v2:127.0.0.1:6828/3107028546,v1:127.0.0.1:6829/3107028546] exists,up a0701ab8-3ad4-4bcd-b0e4-fd484a7aa23c 2026-03-08T22:55:35.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:55:35.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:55:35.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:55:35.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:55:35.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:55:35.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local 
num_active_clean=-1 2026-03-08T22:55:35.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:55:35.924 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:55:35.924 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:55:35.924 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:55:35.925 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:55:35.925 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:55:35.925 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:55:36.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:55:36.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:55:36.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:55:36.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 
2026-03-08T22:55:36.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:55:36.016 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:55:36.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:55:36.253 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:36.253 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:55:36.253 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:55:36.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:55:36.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:36.253 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:55:36.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803796 2026-03-08T22:55:36.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803796 2026-03-08T22:55:36.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803796' 2026-03-08T22:55:36.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:36.354 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:55:36.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364807 2026-03-08T22:55:36.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364807 2026-03-08T22:55:36.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803796 1-214748364807' 2026-03-08T22:55:36.444 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:36.444 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:55:36.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378641 2026-03-08T22:55:36.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378641 2026-03-08T22:55:36.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803796 1-214748364807 2-81604378641' 2026-03-08T22:55:36.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:36.523 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:55:36.598 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=253403070467 2026-03-08T22:55:36.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 253403070467 2026-03-08T22:55:36.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803796 1-214748364807 2-81604378641 3-253403070467' 2026-03-08T22:55:36.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:36.598 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803796 2026-03-08T22:55:36.598 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:36.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:55:36.600 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803796 2026-03-08T22:55:36.600 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:36.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803796 2026-03-08T22:55:36.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803796' 2026-03-08T22:55:36.601 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 
seq 25769803796 2026-03-08T22:55:36.601 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:36.837 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803794 -lt 25769803796 2026-03-08T22:55:36.837 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:55:37.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:55:37.838 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:38.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803794 -lt 25769803796 2026-03-08T22:55:38.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:55:39.088 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:55:39.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:39.312 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803796 -lt 25769803796 2026-03-08T22:55:39.312 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:39.313 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-214748364807 2026-03-08T22:55:39.313 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:39.314 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:55:39.314 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-214748364807 2026-03-08T22:55:39.315 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:39.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364807 2026-03-08T22:55:39.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 214748364807' 2026-03-08T22:55:39.316 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 214748364807 2026-03-08T22:55:39.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:55:39.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364807 -lt 214748364807 2026-03-08T22:55:39.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:39.542 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378641 
2026-03-08T22:55:39.542 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:39.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:55:39.544 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378641 2026-03-08T22:55:39.544 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:39.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378641 2026-03-08T22:55:39.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378641' 2026-03-08T22:55:39.545 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 81604378641 2026-03-08T22:55:39.545 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:55:39.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378641 -lt 81604378641 2026-03-08T22:55:39.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:39.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-253403070467 2026-03-08T22:55:39.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut 
-d - -f 1 2026-03-08T22:55:39.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:55:39.792 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-253403070467 2026-03-08T22:55:39.792 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:39.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=253403070467 2026-03-08T22:55:39.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 253403070467' 2026-03-08T22:55:39.794 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 253403070467 2026-03-08T22:55:39.794 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:55:40.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 253403070467 -lt 253403070467 2026-03-08T22:55:40.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:55:40.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:40.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:40.341 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: 
wait_for_clean: test 5 == 0 2026-03-08T22:55:40.341 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:55:40.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:55:40.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:55:40.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:55:40.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:55:40.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:55:40.342 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:55:40.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:55:40.583 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:55:40.583 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:40.583 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:40.880 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:55:40.880 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:55:40.880 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:55:40.880 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT 2026-03-08T22:55:40.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout 2026-03-08T22:55:41.223 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset 2026-03-08T22:55:41.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:238: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure obj-size-102080-0-0 2026-03-08T22:55:41.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:55:41.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:55:41.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-0-0 2026-03-08T22:55:41.238 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:55:41.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:55:41.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get obj-size-102080-0-0 td/test-erasure-eio/COPY 2026-03-08T22:55:41.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:55:41.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:55:41.269 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: rados_get_data_bad_size: expr 0 + 1 2026-03-08T22:55:41.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: rados_get_data_bad_size: shard_id=1 2026-03-08T22:55:41.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:242: rados_get_data_bad_size: set_size obj-size-102080-0-0 td/test-erasure-eio 1 0 set 2026-03-08T22:55:41.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local objname=obj-size-102080-0-0 2026-03-08T22:55:41.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift 
2026-03-08T22:55:41.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local dir=td/test-erasure-eio 2026-03-08T22:55:41.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift 2026-03-08T22:55:41.270 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=1 2026-03-08T22:55:41.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift 2026-03-08T22:55:41.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=0 2026-03-08T22:55:41.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift 2026-03-08T22:55:41.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=set 2026-03-08T22:55:41.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure 2026-03-08T22:55:41.271 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: get_osds pool-jerasure obj-size-102080-0-0 2026-03-08T22:55:41.271 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:55:41.271 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local 
objectname=obj-size-102080-0-0 2026-03-08T22:55:41.272 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-size-102080-0-0 2026-03-08T22:55:41.272 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:55:41.503 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:55:41.503 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:41.503 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:55:41.503 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:55:41.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2') 2026-03-08T22:55:41.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds 2026-03-08T22:55:41.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=1 2026-03-08T22:55:41.503 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout 2026-03-08T22:55:41.800 INFO:tasks.workunit.client.0.vm04.stderr:noout is set 2026-03-08T22:55:41.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' set = add ']' 2026-03-08T22:55:41.813 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:210: set_size: '[' 0 = 0 ']' 2026-03-08T22:55:41.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:212: set_size: touch td/test-erasure-eio/CORRUPT 2026-03-08T22:55:41.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 1 obj-size-102080-0-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:41.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:55:41.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:55:41.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:55:41.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:55:41.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 1 obj-size-102080-0-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:41.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:55:41.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:55:41.814 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:55:41.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:55:41.814 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.1 2026-03-08T22:55:41.815 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:41.815 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:41.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:41.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:41.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:42.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:42.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 1 obj-size-102080-0-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:42.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 
2026-03-08T22:55:42.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:55:42.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:55:42.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:55:42.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/1 2026-03-08T22:55:42.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/1 obj-size-102080-0-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:43.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 1 2026-03-08T22:55:43.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:55:43.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:55:43.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:55:43.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:55:43.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: 
activate_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:55:43.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:55:43.243 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:55:43.244 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:55:43.244 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:55:43.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:55:43.246 INFO:tasks.workunit.client.0.vm04.stderr:start osd.1 2026-03-08T22:55:43.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 
--osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:55:43.246 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/1/whoami 2026-03-08T22:55:43.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:55:43.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:55:43.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:55:43.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:55:43.264 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:43.265+0000 7fa389ba1780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:43.271 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:43.272+0000 7fa389ba1780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:43.272 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:43.273+0000 7fa389ba1780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:55:43.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:55:43.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:55:43.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:55:43.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:55:43.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:55:43.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:43.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:55:43.469 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:55:43.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:43.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:43.694 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:43.830 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:43.830+0000 7fa389ba1780 -1 Falling back to public interface 2026-03-08T22:55:44.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:55:44.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:44.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:55:44.696 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:44.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:44.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:44.938 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:45.182 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:45.183+0000 7fa389ba1780 -1 osd.1 62 log_to_monitors true 2026-03-08T22:55:45.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:45.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:45.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:55:45.941 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:55:45.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:45.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:46.214 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 up in weight 1 
up_from 68 up_thru 56 down_at 65 last_clean_interval [50,62) [v2:127.0.0.1:6810/435219702,v1:127.0.0.1:6811/435219702] [v2:127.0.0.1:6812/435219702,v1:127.0.0.1:6813/435219702] exists,up d0f972bc-7f7b-4a97-a6fb-5b9afda8f701 2026-03-08T22:55:46.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:55:46.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:55:46.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:55:46.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:55:46.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:55:46.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:55:46.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:55:46.214 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:55:46.215 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:55:46.215 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:55:46.215 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:55:46.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:55:46.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:55:46.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:55:46.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:55:46.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:55:46.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:55:46.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:55:46.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:55:46.532 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:55:46.533 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:46.533 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:55:46.533 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:55:46.533 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:55:46.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:46.533 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:55:46.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803800 2026-03-08T22:55:46.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803800 2026-03-08T22:55:46.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803800' 2026-03-08T22:55:46.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:46.619 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:55:46.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776131 2026-03-08T22:55:46.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776131 2026-03-08T22:55:46.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803800 1-292057776131' 2026-03-08T22:55:46.697 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:46.697 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:55:46.772 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378645 2026-03-08T22:55:46.772 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378645 2026-03-08T22:55:46.772 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803800 1-292057776131 2-81604378645' 2026-03-08T22:55:46.772 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:46.773 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:55:46.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=253403070471 2026-03-08T22:55:46.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 253403070471 2026-03-08T22:55:46.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803800 1-292057776131 2-81604378645 3-253403070471' 2026-03-08T22:55:46.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:46.854 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803800 2026-03-08T22:55:46.854 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:46.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:55:46.856 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803800 2026-03-08T22:55:46.856 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:46.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803800 2026-03-08T22:55:46.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803800' 2026-03-08T22:55:46.857 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803800 2026-03-08T22:55:46.857 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803798 -lt 25769803800 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:55:48.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T22:55:48.092 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:48.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803800 -lt 25769803800 2026-03-08T22:55:48.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:48.323 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-292057776131 2026-03-08T22:55:48.323 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:48.324 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:55:48.325 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-292057776131 2026-03-08T22:55:48.325 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:48.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776131 2026-03-08T22:55:48.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 292057776131' 2026-03-08T22:55:48.326 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 292057776131 2026-03-08T22:55:48.326 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T22:55:48.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776131 -lt 292057776131 2026-03-08T22:55:48.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:48.551 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378645 2026-03-08T22:55:48.551 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:48.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:55:48.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378645 2026-03-08T22:55:48.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:48.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378645 2026-03-08T22:55:48.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378645' 2026-03-08T22:55:48.554 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 81604378645 2026-03-08T22:55:48.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:55:48.778 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378645 -lt 81604378645 2026-03-08T22:55:48.778 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:48.779 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-253403070471 2026-03-08T22:55:48.779 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:48.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:55:48.781 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-253403070471 2026-03-08T22:55:48.781 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:48.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=253403070471 2026-03-08T22:55:48.782 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 253403070471' 2026-03-08T22:55:48.782 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 253403070471 2026-03-08T22:55:48.782 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:55:49.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 253403070471 -lt 
253403070471 2026-03-08T22:55:49.007 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:55:49.008 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:49.008 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:49.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:55:49.311 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:55:49.311 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:55:49.311 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:55:49.311 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:55:49.311 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:55:49.311 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:55:49.311 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq 
'.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:55:49.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:55:49.533 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:55:49.534 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:49.534 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:49.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:55:49.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:55:49.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:55:49.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT 2026-03-08T22:55:49.827 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout 2026-03-08T22:55:50.113 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset 2026-03-08T22:55:50.125 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:243: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure 
obj-size-102080-0-0 fail 2026-03-08T22:55:50.125 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:55:50.125 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:55:50.125 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-0-0 2026-03-08T22:55:50.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=fail 2026-03-08T22:55:50.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' fail = fail ']' 2026-03-08T22:55:50.126 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:114: rados_get: rados --pool pool-jerasure get obj-size-102080-0-0 td/test-erasure-eio/COPY 2026-03-08T22:55:50.150 INFO:tasks.workunit.client.0.vm04.stderr:error getting pool-jerasure/obj-size-102080-0-0: (5) Input/output error 2026-03-08T22:55:50.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:115: rados_get: return 2026-03-08T22:55:50.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:244: rados_get_data_bad_size: rm td/test-erasure-eio/ORIGINAL 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:311: TEST_rados_get_bad_size_shard_0: rados_get_data_bad_size td/test-erasure-eio 0 256 add 
2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:222: rados_get_data_bad_size: local dir=td/test-erasure-eio 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:223: rados_get_data_bad_size: shift 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:224: rados_get_data_bad_size: local shard_id=0 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:225: rados_get_data_bad_size: shift 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:226: rados_get_data_bad_size: local bytes=256 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:227: rados_get_data_bad_size: shift 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:228: rados_get_data_bad_size: local mode=add 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:230: rados_get_data_bad_size: local poolname=pool-jerasure 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:231: rados_get_data_bad_size: local objname=obj-size-102080-0-256 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:232: rados_get_data_bad_size: rados_put td/test-erasure-eio pool-jerasure 
obj-size-102080-0-256 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=obj-size-102080-0-256 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:55:50.155 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:55:50.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put obj-size-102080-0-256 td/test-erasure-eio/ORIGINAL 2026-03-08T22:55:50.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:236: rados_get_data_bad_size: set_size obj-size-102080-0-256 td/test-erasure-eio 0 256 add 2026-03-08T22:55:50.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local objname=obj-size-102080-0-256 2026-03-08T22:55:50.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift 2026-03-08T22:55:50.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local dir=td/test-erasure-eio 2026-03-08T22:55:50.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift 2026-03-08T22:55:50.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=0 2026-03-08T22:55:50.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift 2026-03-08T22:55:50.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=256 2026-03-08T22:55:50.187 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift 2026-03-08T22:55:50.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=add 2026-03-08T22:55:50.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure 2026-03-08T22:55:50.187 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: get_osds pool-jerasure obj-size-102080-0-256 2026-03-08T22:55:50.187 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:55:50.187 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-size-102080-0-256 2026-03-08T22:55:50.187 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-size-102080-0-256 2026-03-08T22:55:50.187 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:55:50.412 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:55:50.412 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:50.412 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:55:50.412 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:55:50.412 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2') 2026-03-08T22:55:50.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds 2026-03-08T22:55:50.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=3 2026-03-08T22:55:50.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout 2026-03-08T22:55:50.700 INFO:tasks.workunit.client.0.vm04.stderr:noout is set 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' add = add ']' 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:208: set_size: objectstore_tool td/test-erasure-eio 3 obj-size-102080-0-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:55:50.712 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 3 obj-size-102080-0-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.3 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:50.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:50.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 
2026-03-08T22:55:50.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:50.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 3 obj-size-102080-0-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:50.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:55:50.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:55:50.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T22:55:50.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:55:50.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/3 2026-03-08T22:55:50.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/3 obj-size-102080-0-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 3 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local 
dir=td/test-erasure-eio 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 
2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:55:51.403 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:55:51.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:55:51.404 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:51.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:55:51.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:55:51.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:51.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:51.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:51.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:51.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:55:51.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:55:51.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:55:51.405 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:55:51.405 INFO:tasks.workunit.client.0.vm04.stderr:start osd.3 2026-03-08T22:55:51.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:55:51.405 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/3/whoami 2026-03-08T22:55:51.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:55:51.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:55:51.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:55:51.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:55:51.423 
INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:51.424+0000 7f894eeed780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:51.429 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:51.430+0000 7f894eeed780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:51.430 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:51.431+0000 7f894eeed780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:51.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:55:51.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:55:51.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:55:51.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:55:51.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:55:51.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:51.644 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:55:51.644 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:55:51.645 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:51.645 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:51.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:52.501 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:52.502+0000 7f894eeed780 -1 Falling back to public interface 2026-03-08T22:55:52.896 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:52.896 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:52.896 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:55:52.897 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:52.897 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:52.897 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:53.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:53.376 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:55:53.377+0000 7f894eeed780 -1 osd.3 71 log_to_monitors true 2026-03-08T22:55:54.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:54.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:54.136 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:55:54.136 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:55:54.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:54.136 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:54.369 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:55.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:55.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:55.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:55:55.371 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:55:55.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:55.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:55:55.604 INFO:tasks.workunit.client.0.vm04.stderr:osd.3 up in weight 1 up_from 77 up_thru 77 down_at 74 last_clean_interval [59,71) [v2:127.0.0.1:6826/2577608945,v1:127.0.0.1:6827/2577608945] [v2:127.0.0.1:6828/2577608945,v1:127.0.0.1:6829/2577608945] exists,up a0701ab8-3ad4-4bcd-b0e4-fd484a7aa23c 2026-03-08T22:55:55.604 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:55:55.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:55:55.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:55:55.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:55:55.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:55:55.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:55:55.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:55:55.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:55:55.605 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:55:55.605 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:55:55.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:55:55.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: true 2026-03-08T22:55:55.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:55:55.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:55:55.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:55:55.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:55:55.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:55:55.689 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:55:55.689 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:55:55.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:55:55.919 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:55:55.919 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:55:55.919 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:55:55.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:55:55.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T22:55:55.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:55:55.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803804 2026-03-08T22:55:55.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803804 2026-03-08T22:55:55.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803804' 2026-03-08T22:55:55.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:55.999 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:55:56.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776135 2026-03-08T22:55:56.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776135 2026-03-08T22:55:56.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803804 1-292057776135' 2026-03-08T22:55:56.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:56.082 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:55:56.155 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378649 2026-03-08T22:55:56.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378649 2026-03-08T22:55:56.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803804 1-292057776135 2-81604378649' 2026-03-08T22:55:56.155 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:56.156 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:55:56.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=330712481795 2026-03-08T22:55:56.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 330712481795 2026-03-08T22:55:56.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803804 1-292057776135 2-81604378649 3-330712481795' 2026-03-08T22:55:56.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:56.231 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803804 2026-03-08T22:55:56.231 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:56.232 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:55:56.233 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803804 2026-03-08T22:55:56.233 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:56.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803804 2026-03-08T22:55:56.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803804' 2026-03-08T22:55:56.234 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803804 2026-03-08T22:55:56.234 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:56.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803804 -lt 25769803804 2026-03-08T22:55:56.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:56.454 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-292057776135 2026-03-08T22:55:56.455 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:56.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:55:56.456 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-292057776135 2026-03-08T22:55:56.457 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:56.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776135 2026-03-08T22:55:56.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 292057776135' 2026-03-08T22:55:56.458 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 292057776135 2026-03-08T22:55:56.458 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:55:56.686 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776133 -lt 292057776135 2026-03-08T22:55:56.686 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:55:57.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:55:57.688 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:55:57.913 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776133 -lt 292057776135 2026-03-08T22:55:57.913 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: 
flush_pg_stats: sleep 1 2026-03-08T22:55:58.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:55:58.915 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:55:59.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776135 -lt 292057776135 2026-03-08T22:55:59.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:59.162 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378649 2026-03-08T22:55:59.162 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:59.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:55:59.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378649 2026-03-08T22:55:59.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:59.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378649 2026-03-08T22:55:59.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378649' 2026-03-08T22:55:59.165 
INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 81604378649 2026-03-08T22:55:59.166 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:55:59.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378649 -lt 81604378649 2026-03-08T22:55:59.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:59.393 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-330712481795 2026-03-08T22:55:59.394 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:59.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:55:59.395 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-330712481795 2026-03-08T22:55:59.395 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:59.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=330712481795 2026-03-08T22:55:59.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 330712481795' 2026-03-08T22:55:59.396 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 330712481795 2026-03-08T22:55:59.397 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:55:59.622 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 330712481795 -lt 330712481795 2026-03-08T22:55:59.622 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:55:59.623 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:59.623 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:59.926 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:55:59.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:55:59.927 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:55:59.927 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:55:59.927 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:55:59.927 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:55:59.927 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:55:59.927 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:56:00.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:56:00.148 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:56:00.148 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:00.148 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:00.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:56:00.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:56:00.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:56:00.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:209: set_size: dd if=/dev/urandom bs=256 count=1 2026-03-08T22:56:00.448 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out 
2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:256 bytes copied, 6.973e-05 s, 3.7 MB/s 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 3 obj-size-102080-0-256 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 3 obj-size-102080-0-256 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: 
_objectstore_tool_nowait: shift 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.3 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:56:00.449 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:56:00.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:56:00.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 3 obj-size-102080-0-256 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:56:00.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:56:00.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:56:00.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: 
_objectstore_tool_nodown: local id=3 2026-03-08T22:56:00.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:56:00.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/3 2026-03-08T22:56:00.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/3 obj-size-102080-0-256 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 3 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 
--osd_mclock_override_recovery_settings=true ' 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:56:01.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:01.721 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:56:01.721 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:56:01.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:56:01.722 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:56:01.722 INFO:tasks.workunit.client.0.vm04.stderr:start osd.3 2026-03-08T22:56:01.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' 
'--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:01.723 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/3/whoami 2026-03-08T22:56:01.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:56:01.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:56:01.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:56:01.726 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:56:01.741 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:01.742+0000 7f5b6642e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:01.748 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:01.749+0000 7f5b6642e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:01.749 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:01.750+0000 7f5b6642e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:56:01.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:56:01.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:56:01.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:56:01.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:56:01.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:56:01.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:01.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:56:01.958 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:56:01.958 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:01.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:56:02.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:02.307 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:02.308+0000 7f5b6642e780 -1 Falling back to public interface 2026-03-08T22:56:03.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:56:03.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:03.186 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:56:03.187 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:56:03.187 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:03.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:56:03.199 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:03.201+0000 7f5b6642e780 -1 osd.3 79 log_to_monitors true 2026-03-08T22:56:03.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:04.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:04.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:04.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:56:04.429 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:56:04.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:04.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:56:04.445 
INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:04.446+0000 7f5b5d3c6640 -1 osd.3 79 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:56:04.678 INFO:tasks.workunit.client.0.vm04.stderr:osd.3 up in weight 1 up_from 83 up_thru 83 down_at 80 last_clean_interval [77,79) [v2:127.0.0.1:6826/4037419910,v1:127.0.0.1:6827/4037419910] [v2:127.0.0.1:6828/4037419910,v1:127.0.0.1:6829/4037419910] exists,up a0701ab8-3ad4-4bcd-b0e4-fd484a7aa23c 2026-03-08T22:56:04.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:56:04.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:56:04.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:56:04.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:56:04.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:56:04.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:56:04.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:56:04.679 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:56:04.679 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: 
get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:56:04.679 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:56:04.679 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:56:04.679 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:56:04.679 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:56:04.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:56:04.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:56:04.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:56:04.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:56:04.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:56:04.768 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:56:04.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:56:04.994 
INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:56:04.994 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:56:04.994 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:56:04.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:56:04.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:04.995 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:56:05.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803807 2026-03-08T22:56:05.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803807 2026-03-08T22:56:05.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803807' 2026-03-08T22:56:05.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:05.070 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:56:05.152 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776138 2026-03-08T22:56:05.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776138 2026-03-08T22:56:05.153 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803807 1-292057776138' 2026-03-08T22:56:05.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:05.153 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:56:05.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378652 2026-03-08T22:56:05.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378652 2026-03-08T22:56:05.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803807 1-292057776138 2-81604378652' 2026-03-08T22:56:05.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:05.231 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:56:05.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=356482285571 2026-03-08T22:56:05.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 356482285571 2026-03-08T22:56:05.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803807 1-292057776138 2-81604378652 3-356482285571' 
2026-03-08T22:56:05.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:05.306 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803807 2026-03-08T22:56:05.306 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:05.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:56:05.308 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803807 2026-03-08T22:56:05.308 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:05.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803807 2026-03-08T22:56:05.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803807' 2026-03-08T22:56:05.309 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803807 2026-03-08T22:56:05.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:05.533 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803805 -lt 25769803807 2026-03-08T22:56:05.534 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 
2026-03-08T22:56:06.534 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:56:06.535 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:06.764 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803808 -lt 25769803807 2026-03-08T22:56:06.764 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:06.764 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-292057776138 2026-03-08T22:56:06.764 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:06.765 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:56:06.766 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-292057776138 2026-03-08T22:56:06.766 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:06.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776138 2026-03-08T22:56:06.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 292057776138' 2026-03-08T22:56:06.767 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 
seq 292057776138 2026-03-08T22:56:06.767 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:56:06.987 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776139 -lt 292057776138 2026-03-08T22:56:06.987 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:06.988 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378652 2026-03-08T22:56:06.988 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:06.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:56:06.990 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378652 2026-03-08T22:56:06.990 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:06.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378652 2026-03-08T22:56:06.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378652' 2026-03-08T22:56:06.991 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 81604378652 2026-03-08T22:56:06.991 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:56:07.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378653 -lt 81604378652 2026-03-08T22:56:07.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:07.225 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-356482285571 2026-03-08T22:56:07.226 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:07.227 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:56:07.227 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-356482285571 2026-03-08T22:56:07.227 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:07.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=356482285571 2026-03-08T22:56:07.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 356482285571' 2026-03-08T22:56:07.228 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 356482285571 2026-03-08T22:56:07.228 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:56:07.461 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 356482285571 -lt 356482285571 2026-03-08T22:56:07.461 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:56:07.462 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:07.462 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:07.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:56:07.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:56:07.773 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:56:07.773 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:56:07.773 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:56:07.773 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:56:07.774 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | 
select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:56:07.774 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:56:07.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T22:56:07.998 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:56:07.998 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:07.998 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:08.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 5 2026-03-08T22:56:08.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 3 '!=' -1 2026-03-08T22:56:08.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:56:08.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=3 2026-03-08T22:56:08.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:56:08.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:56:08.392 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:56:08.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:56:08.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:56:08.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:56:08.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:56:08.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:56:08.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:56:08.393 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:56:08.606 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T22:56:08.606 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:56:08.606 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:08.606 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:08.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 5 2026-03-08T22:56:08.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 3 '!=' 3 2026-03-08T22:56:08.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:56:08.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:56:08.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:56:08.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:56:08.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:56:08.904 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:56:08.904 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:56:09.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null 2026-03-08T22:56:09.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:56:09.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 1 >= 13 )) 2026-03-08T22:56:09.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:56:09.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.2 2026-03-08T22:56:09.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:56:09.445 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:56:09.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:56:09.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:56:09.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and 
contains("clean")) | ' 2026-03-08T22:56:09.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:56:09.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:56:09.446 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:56:09.679 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T22:56:09.679 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:56:09.680 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:09.680 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:09.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 5 2026-03-08T22:56:09.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 3 '!=' 3 2026-03-08T22:56:09.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:56:09.977 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:56:09.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:56:09.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:56:09.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:56:09.978 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:56:09.978 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:56:10.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null 2026-03-08T22:56:10.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:56:10.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 2 >= 13 )) 2026-03-08T22:56:10.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: 
wait_for_clean: eval 2026-03-08T22:56:10.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.4 2026-03-08T22:56:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:56:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:56:10.697 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:56:10.697 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:56:10.697 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:56:10.697 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:56:10.698 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:56:10.698 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:56:10.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:56:10.924 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:56:10.924 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:10.924 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:11.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:56:11.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:56:11.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:56:11.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT 2026-03-08T22:56:11.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout 2026-03-08T22:56:11.569 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset 2026-03-08T22:56:11.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:238: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure obj-size-102080-0-256 2026-03-08T22:56:11.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:56:11.582 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:56:11.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-0-256 2026-03-08T22:56:11.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:56:11.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:56:11.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get obj-size-102080-0-256 td/test-erasure-eio/COPY 2026-03-08T22:56:11.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:56:11.612 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:56:11.613 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: rados_get_data_bad_size: expr 0 + 1 2026-03-08T22:56:11.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: rados_get_data_bad_size: shard_id=1 2026-03-08T22:56:11.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:242: rados_get_data_bad_size: set_size obj-size-102080-0-256 
td/test-erasure-eio 1 256 add 2026-03-08T22:56:11.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local objname=obj-size-102080-0-256 2026-03-08T22:56:11.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local dir=td/test-erasure-eio 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=1 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=256 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=add 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: 
set_size: get_osds pool-jerasure obj-size-102080-0-256 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-size-102080-0-256 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-size-102080-0-256 2026-03-08T22:56:11.615 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:56:11.870 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:56:11.870 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:56:11.870 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:56:11.870 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:56:11.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2') 2026-03-08T22:56:11.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds 2026-03-08T22:56:11.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=1 2026-03-08T22:56:11.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout 
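[editor's note] The `get_osds` trace above pipes `ceph --format json osd map pool-jerasure obj-size-102080-0-256` through `jq '.acting | .[]'` to find which OSDs serve the object, yielding the `3 1 2` that `set_size` captures into `initial_osds`. A minimal python sketch of that extraction, using a hypothetical sample of the osd map JSON (only the `acting` field the helper actually reads is shown):

```python
import json

# Hypothetical, trimmed-down sample of `ceph --format json osd map <pool> <obj>`
# output; the real command returns many more fields.
osd_map = json.loads('{"acting": [3, 1, 2], "up": [3, 1, 2]}')

# Equivalent of the helper's jq filter `.acting | .[]`: the acting set lists
# the OSDs currently serving the PG that maps this object.
acting = [str(osd) for osd in osd_map["acting"]]
print(" ".join(acting))  # the same "3 1 2" string the trace shows get_osds echoing
```

`set_size` then indexes into that list by shard id to pick the OSD whose on-disk shard it will corrupt.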
2026-03-08T22:56:12.146 INFO:tasks.workunit.client.0.vm04.stderr:noout is set 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' add = add ']' 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:208: set_size: objectstore_tool td/test-erasure-eio 1 obj-size-102080-0-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 1 obj-size-102080-0-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local 
id=1 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:56:12.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.1 2026-03-08T22:56:12.159 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:56:12.159 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:56:12.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:56:12.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:56:12.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:56:12.265 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:56:12.265 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 1 obj-size-102080-0-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:56:12.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:56:12.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: 
shift 2026-03-08T22:56:12.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:56:12.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:56:12.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/1 2026-03-08T22:56:12.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/1 obj-size-102080-0-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 1 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:56:12.854 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:56:12.854 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:12.854 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: 
ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:56:12.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:56:12.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:56:12.856 INFO:tasks.workunit.client.0.vm04.stderr:start osd.1 2026-03-08T22:56:12.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 
--debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:12.857 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/1/whoami 2026-03-08T22:56:12.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:56:12.858 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:56:12.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:56:12.860 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:56:12.876 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:12.876+0000 7f75f45f9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:12.884 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:12.885+0000 7f75f45f9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:12.885 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:12.886+0000 7f75f45f9780 -1 WARNING: all dangerous and experimental features are enabled. 
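The `activate_osd` trace above accumulates every flag into a single `ceph_args` string and passes it to `ceph-osd` unquoted, which is why values like `--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok` stay single-quoted in the trace: `$cluster` and `$name` are left for ceph's own metavariable expansion, not the shell's. A minimal sketch of the same accumulation using a bash array (the function name and flag subset are illustrative, not the helper's actual code):

```shell
# Sketch only: mirrors how activate_osd assembles its ceph-osd flag list.
# An array keeps each flag a single word even if paths contain spaces;
# $name is left literal for ceph's metavariable expansion, not bash's.
build_osd_args() {
    local dir=$1 id=$2
    local -a args=(
        "--osd-data=$dir/$id"
        "--osd-journal=$dir/$id/journal"
        "--run-dir=$dir"
        '--log-file='"$dir"'/$name.log'   # $name expanded by ceph, not bash
        --debug-osd=20
    )
    printf '%s\n' "${args[@]}"
}
```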
2026-03-08T22:56:13.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:56:13.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:56:13.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:56:13.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:56:13.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:56:13.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:13.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:56:13.094 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:56:13.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:13.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:13.323 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:13.459 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:13.460+0000 7f75f45f9780 -1 Falling back to public interface 2026-03-08T22:56:14.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:56:14.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:14.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:56:14.326 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:56:14.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:14.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:14.370 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:14.371+0000 7f75f45f9780 -1 osd.1 87 log_to_monitors true 2026-03-08T22:56:14.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:15.443 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:15.443+0000 7f75eb591640 -1 osd.1 87 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:56:15.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:15.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:15.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:56:15.574 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:56:15.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:15.574 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:15.815 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 up in weight 1 up_from 92 up_thru 92 down_at 89 last_clean_interval [68,87) [v2:127.0.0.1:6810/3111072899,v1:127.0.0.1:6811/3111072899] [v2:127.0.0.1:6812/3111072899,v1:127.0.0.1:6813/3111072899] exists,up d0f972bc-7f7b-4a97-a6fb-5b9afda8f701 2026-03-08T22:56:15.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:56:15.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:56:15.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:56:15.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:56:15.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:56:15.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:56:15.816 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:56:15.816 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:56:15.816 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 
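The `wait_for_osd` trace above is a bounded poll: up to 300 one-second attempts of `ceph osd dump | grep 'osd.1 up'`, returning 0 as soon as the grep matches. The pattern, generalized into a retry helper (the helper name is mine, not from `ceph-helpers.sh`):

```shell
# Generic bounded-retry sketch of the wait_for_osd loop: run the given
# command up to $tries times, sleeping 1s between failed attempts.
poll_until() {
    local tries=$1; shift
    local i
    for ((i = 0; i < tries; i++)); do
        "$@" && return 0
        sleep 1
    done
    return 1
}
```

Under this sketch, `wait_for_osd up 1` amounts to `poll_until 300 sh -c 'ceph osd dump | grep -q "osd.1 up"'`.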
2026-03-08T22:56:15.816 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:56:15.817 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:56:15.817 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:56:15.817 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:56:15.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:56:15.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:56:15.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:56:15.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:56:15.912 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:56:15.912 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:56:16.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:56:16.164 
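The `get_timeout_delays 90 .1` call above produced the schedule `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`: delays double from the base until they would exceed a 15-second cap, the cap then repeats, and a final remainder tops the sum up to exactly the requested 90-second budget. A self-contained reconstruction of that schedule (an awk stand-in, not the helper's real implementation):

```shell
# Sketch: reproduce the get_timeout_delays schedule seen in the trace.
# Delays double from $2 up to a 15s cap, then repeat the cap; the last
# entry is the remainder so all delays sum to the $1-second budget.
gen_delays() {
    awk -v total="$1" -v d="$2" 'BEGIN {
        cap = 15; sum = 0
        while (sum + d < total) {
            printf "%g ", d
            sum += d
            d = (d * 2 <= cap) ? d * 2 : cap
        }
        printf "%g\n", total - sum
    }'
}
```

`gen_delays 90 .1` reproduces the `delays` array logged above term for term.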
INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:56:16.164 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:56:16.164 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:56:16.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:56:16.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:16.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:56:16.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803812 2026-03-08T22:56:16.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803812 2026-03-08T22:56:16.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803812' 2026-03-08T22:56:16.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:16.250 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:56:16.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=395136991235 2026-03-08T22:56:16.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 395136991235 2026-03-08T22:56:16.331 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803812 1-395136991235' 2026-03-08T22:56:16.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:16.331 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:56:16.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378657 2026-03-08T22:56:16.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378657 2026-03-08T22:56:16.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803812 1-395136991235 2-81604378657' 2026-03-08T22:56:16.425 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:16.425 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:56:16.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=356482285575 2026-03-08T22:56:16.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 356482285575 2026-03-08T22:56:16.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803812 1-395136991235 2-81604378657 3-356482285575' 
2026-03-08T22:56:16.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:16.506 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803812 2026-03-08T22:56:16.506 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:16.506 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:56:16.507 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803812 2026-03-08T22:56:16.507 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:16.508 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803812 2026-03-08T22:56:16.508 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803812' 2026-03-08T22:56:16.508 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803812 2026-03-08T22:56:16.508 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:16.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803810 -lt 25769803812 2026-03-08T22:56:16.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 
2026-03-08T22:56:17.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:56:17.750 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:17.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803810 -lt 25769803812 2026-03-08T22:56:17.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:56:18.980 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:56:18.981 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:19.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803812 -lt 25769803812 2026-03-08T22:56:19.225 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:19.226 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-395136991235 2026-03-08T22:56:19.226 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:19.227 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:56:19.227 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-395136991235 2026-03-08T22:56:19.227 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:19.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=395136991235 2026-03-08T22:56:19.228 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 395136991235' 2026-03-08T22:56:19.229 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 395136991235 2026-03-08T22:56:19.229 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:56:19.483 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 395136991235 -lt 395136991235 2026-03-08T22:56:19.483 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:19.484 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378657 2026-03-08T22:56:19.487 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:19.487 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:56:19.487 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378657 
2026-03-08T22:56:19.488 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:19.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378657 2026-03-08T22:56:19.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378657' 2026-03-08T22:56:19.488 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 81604378657 2026-03-08T22:56:19.488 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:56:19.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378657 -lt 81604378657 2026-03-08T22:56:19.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:19.748 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-356482285575 2026-03-08T22:56:19.748 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:19.749 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:56:19.749 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-356482285575 2026-03-08T22:56:19.750 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut 
-d - -f 2 2026-03-08T22:56:19.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=356482285575 2026-03-08T22:56:19.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 356482285575' 2026-03-08T22:56:19.750 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 356482285575 2026-03-08T22:56:19.751 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:56:19.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 356482285575 -lt 356482285575 2026-03-08T22:56:19.995 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:56:19.995 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:19.995 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:20.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:56:20.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:56:20.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:56:20.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: 
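The `flush_pg_stats` bookkeeping traced above records each OSD's flush acknowledgement as a space-separated list of `<osd>-<seq>` pairs (e.g. `0-25769803812`), splits every pair with `cut -d - -f 1/2`, and then polls `ceph osd last-stat-seq <osd>` until the reported value is no longer below the recorded seq. That split-and-compare step in isolation, with values taken from the trace (the function name is mine):

```shell
# Sketch of flush_pg_stats' per-OSD check: a pair is "<osd>-<seq>";
# returns 0 once the current last-stat-seq has caught up to <seq>.
seq_caught_up() {
    local pair=$1 cur=$2
    local osd seq
    osd=$(echo "$pair" | cut -d - -f 1)
    seq=$(echo "$pair" | cut -d - -f 2)
    test "$cur" -lt "$seq" && echo "waiting osd.$osd seq $seq" && return 1
    return 0
}
```

In the trace, osd.0 first reported `25769803810`, below the recorded `25769803812`, so the helper slept and retried until the two matched.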
get_num_active_clean: local expression 2026-03-08T22:56:20.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:56:20.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:56:20.328 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:56:20.328 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:56:20.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:56:20.567 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:56:20.567 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:20.567 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:20.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:56:20.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:56:20.879 
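The jq expression in `get_num_active_clean` above counts PG states that contain both `active` and `clean` but not `stale`, applied to `ceph --format json pg dump pgs`. The same predicate as a line-oriented filter, exercised here against a few hand-written states rather than a live cluster (a sketch, not the helper's actual code path):

```shell
# Sketch: equivalent of get_num_active_clean's jq filter for plain-text
# input, one PG state per line on stdin. Counts states containing both
# "active" and "clean" while excluding any that also contain "stale".
num_active_clean() {
    grep active | grep clean | grep -c -v stale
}
```

For example, of the states `active+clean`, `stale+active+clean`, and `active+recovering+degraded`, only the first should count, matching the `select(contains("stale") | not)` clause in the trace.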
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:56:20.880 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:209: set_size: dd if=/dev/urandom bs=256 count=1 2026-03-08T22:56:20.881 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in 2026-03-08T22:56:20.881 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out 2026-03-08T22:56:20.881 INFO:tasks.workunit.client.0.vm04.stderr:256 bytes copied, 0.000119524 s, 2.1 MB/s 2026-03-08T22:56:20.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 1 obj-size-102080-0-256 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:56:20.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:56:20.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:56:20.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:56:20.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:56:20.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 1 obj-size-102080-0-256 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:56:20.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local 
dir=td/test-erasure-eio 2026-03-08T22:56:20.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:56:20.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:56:20.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:56:20.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.1 2026-03-08T22:56:20.882 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:56:20.882 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:56:20.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:56:20.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:56:20.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:56:20.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:56:20.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 1 obj-size-102080-0-256 set-bytes 
td/test-erasure-eio/CORRUPT 2026-03-08T22:56:20.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:56:20.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:56:20.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:56:20.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:56:20.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/1 2026-03-08T22:56:20.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/1 obj-size-102080-0-256 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:56:22.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 1 2026-03-08T22:56:22.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:56:22.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:56:22.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:56:22.114 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:56:22.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:56:22.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:56:22.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:56:22.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:56:22.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:56:22.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:56:22.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:56:22.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:56:22.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:56:22.115 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:56:22.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:56:22.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:22.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:22.115 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:22.115 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:22.115 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:22.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:22.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:22.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:56:22.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 
2026-03-08T22:56:22.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:56:22.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:56:22.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:56:22.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:56:22.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:56:22.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:56:22.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:56:22.116 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:56:22.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:56:22.117 INFO:tasks.workunit.client.0.vm04.stderr:start osd.1 2026-03-08T22:56:22.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 
--auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:22.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/1/whoami 2026-03-08T22:56:22.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:56:22.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:56:22.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:56:22.121 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:56:22.135 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:22.136+0000 7f1ff69a3780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:22.139 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:22.140+0000 7f1ff69a3780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:56:22.141 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:22.141+0000 7f1ff69a3780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:22.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:56:22.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:56:22.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:56:22.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:56:22.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:56:22.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:22.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:56:22.348 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:56:22.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:22.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:22.595 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:22.951 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:22.952+0000 7f1ff69a3780 -1 Falling back to 
public interface 2026-03-08T22:56:23.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:23.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:23.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:56:23.598 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:56:23.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:23.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:23.823 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:23.824+0000 7f1ff69a3780 -1 osd.1 93 log_to_monitors true 2026-03-08T22:56:23.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:24.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:24.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:24.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:56:24.831 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:56:24.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:24.831 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:25.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:25.432 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:25.433+0000 7f1fed939640 -1 osd.1 93 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:56:26.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:26.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:26.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:56:26.131 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T22:56:26.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:26.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:26.440 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 up in weight 1 up_from 97 up_thru 97 down_at 94 last_clean_interval [92,93) [v2:127.0.0.1:6810/2904755786,v1:127.0.0.1:6811/2904755786] [v2:127.0.0.1:6812/2904755786,v1:127.0.0.1:6813/2904755786] exists,up d0f972bc-7f7b-4a97-a6fb-5b9afda8f701 2026-03-08T22:56:26.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:56:26.441 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:56:26.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:56:26.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:56:26.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:56:26.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:56:26.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:56:26.441 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:56:26.442 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:56:26.442 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:56:26.442 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:56:26.487 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:56:26.487 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:56:26.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:56:26.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:56:26.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:56:26.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:56:26.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:56:26.524 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:56:26.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:56:26.752 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:56:26.752 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:56:26.752 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:56:26.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:56:26.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:26.752 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 
flush_pg_stats 2026-03-08T22:56:26.842 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803816 2026-03-08T22:56:26.842 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803816 2026-03-08T22:56:26.842 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803816' 2026-03-08T22:56:26.842 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:26.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:56:26.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=416611827715 2026-03-08T22:56:26.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 416611827715 2026-03-08T22:56:26.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803816 1-416611827715' 2026-03-08T22:56:26.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:26.924 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:56:27.004 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378661 2026-03-08T22:56:27.005 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378661 2026-03-08T22:56:27.005 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803816 1-416611827715 2-81604378661' 2026-03-08T22:56:27.005 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:27.005 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:56:27.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=356482285579 2026-03-08T22:56:27.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 356482285579 2026-03-08T22:56:27.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803816 1-416611827715 2-81604378661 3-356482285579' 2026-03-08T22:56:27.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:27.082 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803816 2026-03-08T22:56:27.082 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:27.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:56:27.084 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803816 2026-03-08T22:56:27.084 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:27.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803816 2026-03-08T22:56:27.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803816' 2026-03-08T22:56:27.085 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803816 2026-03-08T22:56:27.086 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:27.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803814 -lt 25769803816 2026-03-08T22:56:27.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:56:28.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:56:28.373 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:28.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803816 -lt 25769803816 2026-03-08T22:56:28.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T22:56:28.602 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-416611827715 2026-03-08T22:56:28.602 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:28.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:56:28.603 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-416611827715 2026-03-08T22:56:28.603 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:28.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=416611827715 2026-03-08T22:56:28.604 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 416611827715' 2026-03-08T22:56:28.604 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 416611827715 2026-03-08T22:56:28.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:56:28.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 416611827715 -lt 416611827715 2026-03-08T22:56:28.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:28.836 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-81604378661 2026-03-08T22:56:28.836 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:28.837 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:56:28.838 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378661 2026-03-08T22:56:28.838 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:28.839 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378661 2026-03-08T22:56:28.839 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378661' 2026-03-08T22:56:28.839 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 81604378661 2026-03-08T22:56:28.840 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:56:29.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378661 -lt 81604378661 2026-03-08T22:56:29.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:29.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-356482285579 2026-03-08T22:56:29.117 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:29.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:56:29.119 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-356482285579 2026-03-08T22:56:29.119 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:29.120 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=356482285579 2026-03-08T22:56:29.121 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 356482285579' 2026-03-08T22:56:29.121 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 356482285579 2026-03-08T22:56:29.121 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:56:29.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 356482285579 -lt 356482285579 2026-03-08T22:56:29.369 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:56:29.369 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:29.369 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 
2026-03-08T22:56:29.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:56:29.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:56:29.735 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:56:29.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:56:29.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:56:29.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:56:29.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:56:29.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:56:29.969 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:56:29.970 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:56:29.970 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:29.970 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:30.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:56:30.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:56:30.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:56:30.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT 2026-03-08T22:56:30.307 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout 2026-03-08T22:56:30.645 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset 2026-03-08T22:56:30.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:243: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure obj-size-102080-0-256 fail 2026-03-08T22:56:30.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:56:30.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:56:30.656 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-0-256 2026-03-08T22:56:30.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=fail 2026-03-08T22:56:30.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' fail = fail ']' 2026-03-08T22:56:30.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:114: rados_get: rados --pool pool-jerasure get obj-size-102080-0-256 td/test-erasure-eio/COPY 2026-03-08T22:56:30.678 INFO:tasks.workunit.client.0.vm04.stderr:error getting pool-jerasure/obj-size-102080-0-256: (5) Input/output error 2026-03-08T22:56:30.680 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:115: rados_get: return 2026-03-08T22:56:30.680 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:244: rados_get_data_bad_size: rm td/test-erasure-eio/ORIGINAL 2026-03-08T22:56:30.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:312: TEST_rados_get_bad_size_shard_0: delete_erasure_coded_pool pool-jerasure 2026-03-08T22:56:30.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:56:30.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure 
--yes-i-really-really-mean-it 2026-03-08T22:56:30.899 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T22:56:30.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile 2026-03-08T22:56:31.190 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T22:56:31.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T22:56:31.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:56:31.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:56:31.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:56:31.203 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:56:31.203 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:56:31.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:56:31.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:56:31.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o 
xtrace 2026-03-08T22:56:31.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:56:31.325 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:56:31.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:56:31.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:56:31.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:56:31.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:56:31.328 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:56:31.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:31.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:56:31.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:56:31.330 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:31.331 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:56:31.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:56:31.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:56:31.363 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:56:31.363 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:31.363 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:31.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:56:31.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:56:31.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:56:31.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T22:56:31.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T22:56:31.367 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T22:56:31.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio 2026-03-08T22:56:31.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:56:31.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:56:31.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:56:31.368 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:56:31.368 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:56:31.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:56:31.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:56:31.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:56:31.369 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:56:31.370 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 
2026-03-08T22:56:31.370 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:56:31.371 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:56:31.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:56:31.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:56:31.372 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:56:31.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:31.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:56:31.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:56:31.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:31.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:56:31.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:56:31.375 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:56:31.376 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:56:31.377 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:31.377 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:31.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:56:31.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:56:31.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:56:31.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T22:56:31.379 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:56:31.379 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:31.379 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:31.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p 
/tmp/ceph-asok.102080 2026-03-08T22:56:31.380 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:56:31.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:56:31.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:56:31.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T22:56:31.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T22:56:31.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T22:56:31.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:56:31.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:56:31.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:56:31.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T22:56:31.381 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T22:56:31.408 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:56:31.408 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:31.408 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:31.408 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:31.408 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:31.408 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:31.408 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:31.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 
2026-03-08T22:56:31.441 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:56:31.443 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:56:31.443 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:56:31.443 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:56:31.443 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:56:31.443 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:56:31.444 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:56:31.444 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:56:31.444 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:56:31.444 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:31.444 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:31.444 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: 
jq -r .fsid 2026-03-08T22:56:31.444 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:56:31.444 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:56:31.444 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T22:56:31.498 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:56:31.498 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:56:31.498 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:56:31.498 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:56:31.498 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:56:31.500 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:56:31.500 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:56:31.500 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:56:31.500 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:56:31.501 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:31.501 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:31.501 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:56:31.501 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:56:31.501 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T22:56:31.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T22:56:31.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T22:56:31.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:56:31.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:56:31.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:56:31.560 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T22:56:31.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:56:31.686 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:56:31.686 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:31.686 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:31.686 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:31.686 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:31.686 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:31.686 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:31.687 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:56:31.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 
20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:56:31.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T22:56:31.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:56:31.834 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:56:31.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:56:32.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T22:56:32.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:56:32.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:56:32.851 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:56:32.851 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:32.851 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:32.851 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:56:32.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T22:56:32.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T22:56:32.897 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:56:32.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T22:56:32.904 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:56:31.432+0000 7f29d21d8d80 0 load: jerasure load: lrc 2026-03-08T22:56:32.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_rados_get_bad_size_shard_1 td/test-erasure-eio 2026-03-08T22:56:32.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:316: TEST_rados_get_bad_size_shard_1: local dir=td/test-erasure-eio 2026-03-08T22:56:32.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:317: TEST_rados_get_bad_size_shard_1: setup_osds 4 2026-03-08T22:56:32.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=4 2026-03-08T22:56:32.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T22:56:32.905 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 4 - 1
2026-03-08T22:56:32.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 3
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0'
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal'
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:56:32.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:56:32.908 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:56:32.908 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:56:32.908 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:56:32.908 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:56:32.908 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:32.908 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:56:32.908 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:56:32.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0
2026-03-08T22:56:32.910 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:56:32.911 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=be9f5cc8-018b-4c7a-889c-2e7f7f1ce9f6
2026-03-08T22:56:32.919 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 be9f5cc8-018b-4c7a-889c-2e7f7f1ce9f6
2026-03-08T22:56:32.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 be9f5cc8-018b-4c7a-889c-2e7f7f1ce9f6'
2026-03-08T22:56:32.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:56:32.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCg/q1pIxYxNxAA+6dH0IZHN014mTa8yq0Wuw==
2026-03-08T22:56:32.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCg/q1pIxYxNxAA+6dH0IZHN014mTa8yq0Wuw=="}'
2026-03-08T22:56:32.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new be9f5cc8-018b-4c7a-889c-2e7f7f1ce9f6 -i td/test-erasure-eio/0/new.json
2026-03-08T22:56:33.227 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:56:33.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json
2026-03-08T22:56:33.255 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCg/q1pIxYxNxAA+6dH0IZHN014mTa8yq0Wuw== --osd-uuid be9f5cc8-018b-4c7a-889c-2e7f7f1ce9f6
2026-03-08T22:56:33.257 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:33.256+0000 7fefd5e2c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:33.264 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:33.265+0000 7fefd5e2c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:33.266 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:33.266+0000 7fefd5e2c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:33.266 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:33.266+0000 7fefd5e2c780 -1 bdev(0x5649f03ac800 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:56:33.266 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:33.266+0000 7fefd5e2c780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid
2026-03-08T22:56:35.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring
2026-03-08T22:56:35.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:56:35.978 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository
2026-03-08T22:56:35.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:56:35.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:56:36.293 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0
2026-03-08T22:56:36.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:56:36.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:56:36.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:56:36.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:56:36.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:56:36.313 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:36.314+0000 7f993302c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:36.315 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:36.316+0000 7f993302c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:36.317 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:36.317+0000 7f993302c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:36.609 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:56:36.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:56:36.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:56:36.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:56:36.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:56:36.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:56:36.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:36.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:56:36.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:56:36.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:36.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:36.885 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:36.886+0000 7f993302c780 -1 Falling back to public interface
2026-03-08T22:56:37.756 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:37.757+0000 7f993302c780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:56:37.830 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:56:37.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:37.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:37.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:56:37.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:37.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:56:38.099 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:39.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:39.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:39.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:56:39.100 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:56:39.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:39.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:56:39.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:39.635 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:39.636+0000 7f992dded640 -1 osd.0 0 waiting for initial osdmap
2026-03-08T22:56:40.348 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:56:40.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:40.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:40.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:56:40.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:40.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:56:40.576 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/226074901,v1:127.0.0.1:6803/226074901] [v2:127.0.0.1:6804/226074901,v1:127.0.0.1:6805/226074901] exists,up be9f5cc8-018b-4c7a-889c-2e7f7f1ce9f6
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1'
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal'
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:56:40.577 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:56:40.578 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:56:40.578 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:56:40.578 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:56:40.578 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:56:40.578 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:40.578 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:56:40.578 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:56:40.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1
2026-03-08T22:56:40.580 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:56:40.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0649b5ed-f2a2-46d8-890a-68ec9ad1519c
2026-03-08T22:56:40.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 0649b5ed-f2a2-46d8-890a-68ec9ad1519c'
2026-03-08T22:56:40.581 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 0649b5ed-f2a2-46d8-890a-68ec9ad1519c
2026-03-08T22:56:40.581 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:56:40.595 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCo/q1pXvuMIxAAE9Kfji+sUFuQYgYJYJfamw==
2026-03-08T22:56:40.595 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCo/q1pXvuMIxAAE9Kfji+sUFuQYgYJYJfamw=="}'
2026-03-08T22:56:40.595 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0649b5ed-f2a2-46d8-890a-68ec9ad1519c -i td/test-erasure-eio/1/new.json
2026-03-08T22:56:40.933 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:56:40.942 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json
2026-03-08T22:56:40.943 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCo/q1pXvuMIxAAE9Kfji+sUFuQYgYJYJfamw== --osd-uuid 0649b5ed-f2a2-46d8-890a-68ec9ad1519c
2026-03-08T22:56:40.963 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:40.964+0000 7f5aae882780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:40.972 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:40.966+0000 7f5aae882780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:40.972 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:40.967+0000 7f5aae882780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:40.972 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:40.967+0000 7f5aae882780 -1 bdev(0x55ad5261fc00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:56:40.972 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:40.967+0000 7f5aae882780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid
2026-03-08T22:56:43.607 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring
2026-03-08T22:56:43.607 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:56:43.608 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository
2026-03-08T22:56:43.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T22:56:43.608 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:56:43.959 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1
2026-03-08T22:56:43.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T22:56:43.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:56:43.959 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:56:43.960 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:56:43.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:56:43.976 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:43.977+0000 7f38a89b4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:43.982 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:43.984+0000 7f38a89b4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:43.984 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:43.985+0000 7f38a89b4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:44.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T22:56:44.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:56:44.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:56:44.299 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:56:44.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:56:44.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:56:44.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:44.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:56:44.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:44.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:44.518 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:44.553 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:44.554+0000 7f38a89b4780 -1 Falling back to public interface
2026-03-08T22:56:45.520 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:56:45.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:45.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:45.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:56:45.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:45.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:45.646 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:45.647+0000 7f38a89b4780 -1 osd.1 0 log_to_monitors true
2026-03-08T22:56:45.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:46.736 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:56:46.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:46.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:46.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:56:46.738 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:46.738 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:46.970 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:47.973 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:56:47.974 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:47.974 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:47.974 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:56:47.974 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:47.974 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:48.086 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:48.087+0000 7f38a4155640 -1 osd.1 0 waiting for initial osdmap
2026-03-08T22:56:48.231 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 12 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/138188872,v1:127.0.0.1:6811/138188872] [v2:127.0.0.1:6812/138188872,v1:127.0.0.1:6813/138188872] exists,up 0649b5ed-f2a2-46d8-890a-68ec9ad1519c
2026-03-08T22:56:48.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:56:48.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:56:48.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2
2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2
2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:56:48.237
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:56:48.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:56:48.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:56:48.241 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:56:48.241 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:48.241 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:48.241 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:48.241 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:48.241 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:48.243 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:56:48.244 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:56:48.244 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:56:48.245 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:56:48.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=ae882e4b-3141-437c-b90b-795070ff9b8d 2026-03-08T22:56:48.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 ae882e4b-3141-437c-b90b-795070ff9b8d' 2026-03-08T22:56:48.246 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 ae882e4b-3141-437c-b90b-795070ff9b8d 2026-03-08T22:56:48.246 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:56:48.260 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCw/q1pAxGODxAAspgGZvEy+Jh82C/gueaO2g== 2026-03-08T22:56:48.260 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCw/q1pAxGODxAAspgGZvEy+Jh82C/gueaO2g=="}' 2026-03-08T22:56:48.260 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new ae882e4b-3141-437c-b90b-795070ff9b8d -i td/test-erasure-eio/2/new.json 2026-03-08T22:56:48.488 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:56:48.498 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T22:56:48.498 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCw/q1pAxGODxAAspgGZvEy+Jh82C/gueaO2g== --osd-uuid ae882e4b-3141-437c-b90b-795070ff9b8d 2026-03-08T22:56:48.517 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:48.518+0000 7f9e45697780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:48.519 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:48.520+0000 7f9e45697780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:56:48.520 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:48.521+0000 7f9e45697780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:48.520 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:48.521+0000 7f9e45697780 -1 bdev(0x55be2f36fc00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:56:48.520 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:48.521+0000 7f9e45697780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T22:56:50.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T22:56:50.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:56:50.925 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:56:50.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:56:50.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:56:51.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:56:51.246 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:56:51.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:51.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:56:51.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:56:51.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:56:51.267 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:51.266+0000 7f8ba160c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:51.267 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:51.269+0000 7f8ba160c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:51.269 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:51.270+0000 7f8ba160c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:56:51.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:56:51.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:56:51.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:56:51.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:56:51.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:56:51.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:51.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:56:51.461 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:56:51.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:51.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:56:51.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:52.095 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:52.097+0000 7f8ba160c780 -1 Falling back to public interface 2026-03-08T22:56:52.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:56:52.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:52.691 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:56:52.691 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:56:52.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:52.693 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:56:52.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:52.965 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:52.966+0000 7f8ba160c780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:56:53.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:53.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:53.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:56:53.925 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:56:53.926 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:53.926 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:56:54.174 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:55.175 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:56:55.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:55.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:55.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:56:55.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:55.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:56:55.395 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 18 up_thru 20 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/300267986,v1:127.0.0.1:6819/300267986] [v2:127.0.0.1:6820/300267986,v1:127.0.0.1:6821/300267986] exists,up ae882e4b-3141-437c-b90b-795070ff9b8d 2026-03-08T22:56:55.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:56:55.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:56:55.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:56:55.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count 
- 1)) 2026-03-08T22:56:55.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3 2026-03-08T22:56:55.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:56:55.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:56:55.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:56:55.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:56:55.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:56:55.396 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:55.396 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:56:55.396 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:56:55.397 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:56:55.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:56:55.398 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:56:55.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0760ca8d-6623-4c83-9449-3435282c03df 2026-03-08T22:56:55.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 0760ca8d-6623-4c83-9449-3435282c03df' 2026-03-08T22:56:55.399 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 0760ca8d-6623-4c83-9449-3435282c03df 2026-03-08T22:56:55.399 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:56:55.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQC3/q1p2fOoGBAAYatLhzn0Hw6uG6wgzbRdhg== 2026-03-08T22:56:55.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQC3/q1p2fOoGBAAYatLhzn0Hw6uG6wgzbRdhg=="}' 2026-03-08T22:56:55.412 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0760ca8d-6623-4c83-9449-3435282c03df -i td/test-erasure-eio/3/new.json 2026-03-08T22:56:55.635 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:56:55.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json 2026-03-08T22:56:55.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQC3/q1p2fOoGBAAYatLhzn0Hw6uG6wgzbRdhg== --osd-uuid 0760ca8d-6623-4c83-9449-3435282c03df 2026-03-08T22:56:55.665 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:55.666+0000 7f95a480c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:55.667 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:55.669+0000 7f95a480c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:55.668 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:55.670+0000 7f95a480c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:56:55.669 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:55.670+0000 7f95a480c780 -1 bdev(0x5570d1fcbc00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:56:55.669 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:55.670+0000 7f95a480c780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid 2026-03-08T22:56:57.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring 2026-03-08T22:56:57.785 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:56:57.786 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository 2026-03-08T22:56:57.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:56:57.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:56:58.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:56:58.082 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3 2026-03-08T22:56:58.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:58.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:56:58.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:56:58.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:56:58.101 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:58.101+0000 7f4fcd13f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:58.102 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:58.103+0000 7f4fcd13f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:58.104 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:58.104+0000 7f4fcd13f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:56:58.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:56:58.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:56:58.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:56:58.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:56:58.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:56:58.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:58.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:56:58.310 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:56:58.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:58.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:56:58.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:59.426 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:56:59.427+0000 7f4fcd13f780 -1 Falling back to public interface 2026-03-08T22:56:59.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:56:59.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:59.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:56:59.530 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:56:59.530 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:59.531 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:56:59.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:00.288 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:00.289+0000 7f4fcd13f780 -1 osd.3 0 log_to_monitors true 2026-03-08T22:57:00.747 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:57:00.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:00.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:00.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:00.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:00.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:57:00.984 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:01.710 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:01.711+0000 7f4fc88de640 -1 osd.3 0 waiting for initial osdmap 2026-03-08T22:57:01.987 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:57:01.987 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:01.987 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:01.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:01.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:01.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:57:02.235 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 28 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3741043981,v1:127.0.0.1:6827/3741043981] [v2:127.0.0.1:6828/3741043981,v1:127.0.0.1:6829/3741043981] exists,up 0760ca8d-6623-4c83-9449-3435282c03df 2026-03-08T22:57:02.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:02.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:02.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:02.235 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 2026-03-08T22:57:02.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:57:02.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:57:02.236 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:57:02.236 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:02.236 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:57:02.236 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:57:02.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T22:57:02.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T22:57:02.291 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:57:02.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T22:57:02.299 
INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:56:36.890+0000 7f993302c780 0 load: jerasure load: lrc 2026-03-08T22:57:02.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:319: TEST_rados_get_bad_size_shard_1: local poolname=pool-jerasure 2026-03-08T22:57:02.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:320: TEST_rados_get_bad_size_shard_1: create_erasure_coded_pool pool-jerasure 2 1 2026-03-08T22:57:02.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:57:02.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T22:57:02.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=2 2026-03-08T22:57:02.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T22:57:02.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=1 2026-03-08T22:57:02.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T22:57:02.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=2 m=1 crush-failure-domain=osd 
2026-03-08T22:57:02.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T22:57:02.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T22:57:02.990 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T22:57:02.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:57:04.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T22:57:04.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:57:04.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:04.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:57:04.000 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:04.000 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:04.000 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:04.001 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:04.001 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:04.001 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:04.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:04.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:57:04.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:57:04.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:04.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:57:04.079 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:04.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:04.309 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:04.309 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:57:04.309 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:57:04.309 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:04.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:04.310 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:57:04.388 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803783 2026-03-08T22:57:04.389 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803783 2026-03-08T22:57:04.389 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783' 2026-03-08T22:57:04.389 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:04.389 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:04.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=51539607558 2026-03-08T22:57:04.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 51539607558 2026-03-08T22:57:04.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-51539607558' 2026-03-08T22:57:04.465 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:04.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:57:04.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411332 2026-03-08T22:57:04.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411332 2026-03-08T22:57:04.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-51539607558 2-77309411332' 2026-03-08T22:57:04.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:04.543 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:04.620 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116995 2026-03-08T22:57:04.620 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116995 2026-03-08T22:57:04.620 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-51539607558 2-77309411332 3-115964116995' 2026-03-08T22:57:04.620 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:04.620 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803783 2026-03-08T22:57:04.620 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:04.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:04.622 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803783 2026-03-08T22:57:04.622 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:04.623 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803783 2026-03-08T22:57:04.623 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803783' 2026-03-08T22:57:04.623 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803783 2026-03-08T22:57:04.623 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:04.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803781 -lt 25769803783 2026-03-08T22:57:04.859 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:05.860 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T22:57:05.861 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:06.108 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803781 -lt 25769803783 2026-03-08T22:57:06.108 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:07.109 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:57:07.109 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:07.324 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803784 -lt 25769803783 2026-03-08T22:57:07.324 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:07.324 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-51539607558 2026-03-08T22:57:07.324 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:07.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:07.326 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-51539607558 2026-03-08T22:57:07.326 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:07.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=51539607558 2026-03-08T22:57:07.327 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 51539607558 2026-03-08T22:57:07.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 51539607558' 2026-03-08T22:57:07.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:07.534 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 51539607558 -lt 51539607558 2026-03-08T22:57:07.534 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:07.535 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-77309411332 2026-03-08T22:57:07.535 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:07.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:57:07.536 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-77309411332 2026-03-08T22:57:07.536 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T22:57:07.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411332 2026-03-08T22:57:07.537 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 77309411332 2026-03-08T22:57:07.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 77309411332' 2026-03-08T22:57:07.537 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:07.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411333 -lt 77309411332 2026-03-08T22:57:07.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:07.796 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116995 2026-03-08T22:57:07.796 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:07.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:07.796 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116995 2026-03-08T22:57:07.796 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:07.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=115964116995 2026-03-08T22:57:07.796 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116995 2026-03-08T22:57:07.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116995' 2026-03-08T22:57:07.796 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:57:08.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116995 -lt 115964116995 2026-03-08T22:57:08.010 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:08.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:08.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:08.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:57:08.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:08.303 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:08.303 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:08.303 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:57:08.303 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:57:08.303 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:57:08.303 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:57:08.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:57:08.528 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:57:08.528 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:57:08.528 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:322: TEST_rados_get_bad_size_shard_1: local shard_id=1
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:323: TEST_rados_get_bad_size_shard_1: rados_get_data_bad_size td/test-erasure-eio 1 10
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:222: rados_get_data_bad_size: local dir=td/test-erasure-eio
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:223: rados_get_data_bad_size: shift
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:224: rados_get_data_bad_size: local shard_id=1
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:225: rados_get_data_bad_size: shift
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:226: rados_get_data_bad_size: local bytes=10
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:227: rados_get_data_bad_size: shift
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:228: rados_get_data_bad_size: local mode=set
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:230: rados_get_data_bad_size: local poolname=pool-jerasure
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:231: rados_get_data_bad_size: local objname=obj-size-102080-1-10
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:232: rados_get_data_bad_size: rados_put td/test-erasure-eio pool-jerasure obj-size-102080-1-10
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio
2026-03-08T22:57:08.821 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure
2026-03-08T22:57:08.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=obj-size-102080-1-10
2026-03-08T22:57:08.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:57:08.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA
2026-03-08T22:57:08.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:57:08.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB
2026-03-08T22:57:08.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:57:08.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC
2026-03-08T22:57:08.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:57:08.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD
2026-03-08T22:57:08.822 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put obj-size-102080-1-10 td/test-erasure-eio/ORIGINAL
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:236: rados_get_data_bad_size: set_size obj-size-102080-1-10 td/test-erasure-eio 1 10 set
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local objname=obj-size-102080-1-10
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local dir=td/test-erasure-eio
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=1
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=10
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=set
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: get_osds pool-jerasure obj-size-102080-1-10
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-size-102080-1-10
2026-03-08T22:57:08.853 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-size-102080-1-10
2026-03-08T22:57:08.854 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T22:57:09.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T22:57:09.089 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:57:09.089 INFO:tasks.workunit.client.0.vm04.stderr:2'
2026-03-08T22:57:09.089 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2
2026-03-08T22:57:09.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2')
2026-03-08T22:57:09.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds
2026-03-08T22:57:09.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=1
2026-03-08T22:57:09.089 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout
2026-03-08T22:57:09.382 INFO:tasks.workunit.client.0.vm04.stderr:noout is set
2026-03-08T22:57:09.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' set = add ']'
2026-03-08T22:57:09.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:210: set_size: '[' 10 = 0 ']'
2026-03-08T22:57:09.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:214: set_size: dd if=/dev/urandom bs=10 count=1 of=td/test-erasure-eio/CORRUPT
2026-03-08T22:57:09.393 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in
2026-03-08T22:57:09.393 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out
2026-03-08T22:57:09.393 INFO:tasks.workunit.client.0.vm04.stderr:10 bytes copied, 5.9671e-05 s, 168 kB/s
2026-03-08T22:57:09.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 1 obj-size-102080-1-10 set-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:57:09.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio
2026-03-08T22:57:09.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 1 obj-size-102080-1-10 set-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.1
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:57:09.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:57:09.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:57:09.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 1 obj-size-102080-1-10 set-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:57:09.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio
2026-03-08T22:57:09.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T22:57:09.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T22:57:09.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T22:57:09.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/1
2026-03-08T22:57:09.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/1 obj-size-102080-1-10 set-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:57:10.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 1
2026-03-08T22:57:10.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio
2026-03-08T22:57:10.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T22:57:10.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T22:57:10.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T22:57:10.613 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/1
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1'
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal'
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:57:10.614 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:57:10.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:57:10.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:57:10.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:57:10.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:57:10.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:57:10.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:57:10.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:57:10.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:57:10.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T22:57:10.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T22:57:10.615 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/1
2026-03-08T22:57:10.616 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T22:57:10.616 INFO:tasks.workunit.client.0.vm04.stderr:start osd.1
2026-03-08T22:57:10.616 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:57:10.617 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/1/whoami
2026-03-08T22:57:10.617 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T22:57:10.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:57:10.619 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:57:10.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:57:10.635 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:10.636+0000 7fa06e80c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:57:10.643 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:10.643+0000 7fa06e80c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:57:10.643 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:10.645+0000 7fa06e80c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:57:10.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T22:57:10.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:57:10.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:57:10.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:57:10.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:57:10.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:57:10.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:57:10.847 INFO:tasks.workunit.client.0.vm04.stderr:0
2026-03-08T22:57:10.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:57:10.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:57:11.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:57:11.710 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:11.711+0000 7fa06e80c780 -1 Falling back to public interface
2026-03-08T22:57:12.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:57:12.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:57:12.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:57:12.069 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:57:12.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:57:12.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:57:12.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:57:12.556 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:12.557+0000 7fa06e80c780 -1 osd.1 36 log_to_monitors true
2026-03-08T22:57:13.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:57:13.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:57:13.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:57:13.285 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:57:13.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:57:13.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:57:13.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:57:13.632 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:13.632+0000 7fa0651fb640 -1 osd.1 36 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T22:57:14.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:57:14.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:57:14.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:57:14.507 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:57:14.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:57:14.507 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:57:14.730 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 up in weight 1 up_from 41 up_thru 41 down_at 38 last_clean_interval [12,36) [v2:127.0.0.1:6810/3798422003,v1:127.0.0.1:6811/3798422003] [v2:127.0.0.1:6812/3798422003,v1:127.0.0.1:6813/3798422003] exists,up 0649b5ed-f2a2-46d8-890a-68ec9ad1519c
2026-03-08T22:57:14.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:57:14.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:57:14.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:57:14.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:57:14.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:57:14.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:57:14.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:57:14.731 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:57:14.731 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:57:14.731 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:57:14.731 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:57:14.731 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:57:14.732 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:57:14.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:57:14.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:57:14.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:57:14.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:57:14.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:57:14.805 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:57:15.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:57:15.024 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:57:15.024 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:57:15.024 INFO:tasks.workunit.client.0.vm04.stderr:3'
2026-03-08T22:57:15.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:57:15.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:57:15.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:57:15.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803788
2026-03-08T22:57:15.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803788
2026-03-08T22:57:15.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788'
2026-03-08T22:57:15.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:57:15.101 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:57:15.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=176093659139
2026-03-08T22:57:15.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 176093659139
2026-03-08T22:57:15.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788 1-176093659139'
2026-03-08T22:57:15.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:57:15.176 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:57:15.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=77309411337
2026-03-08T22:57:15.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 77309411337
2026-03-08T22:57:15.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788 1-176093659139 2-77309411337'
2026-03-08T22:57:15.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:57:15.247 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:57:15.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116999
2026-03-08T22:57:15.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116999
2026-03-08T22:57:15.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788 1-176093659139 2-77309411337 3-115964116999'
2026-03-08T22:57:15.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:57:15.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803788
2026-03-08T22:57:15.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:57:15.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:57:15.318 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803788
2026-03-08T22:57:15.318 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:57:15.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803788
2026-03-08T22:57:15.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803788'
2026-03-08T22:57:15.319 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803788
2026-03-08T22:57:15.319 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:57:15.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803785 -lt 25769803788
2026-03-08T22:57:15.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:57:16.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:57:16.526 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:57:16.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803788 -lt 25769803788
2026-03-08T22:57:16.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T22:57:16.743 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-176093659139 2026-03-08T22:57:16.744 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:16.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:16.745 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-176093659139 2026-03-08T22:57:16.745 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:16.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=176093659139 2026-03-08T22:57:16.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 176093659139' 2026-03-08T22:57:16.747 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 176093659139 2026-03-08T22:57:16.747 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:16.965 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 176093659139 -lt 176093659139 2026-03-08T22:57:16.965 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:16.966 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-77309411337 2026-03-08T22:57:16.966 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:16.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:57:16.967 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-77309411337 2026-03-08T22:57:16.967 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:16.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=77309411337 2026-03-08T22:57:16.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 77309411337' 2026-03-08T22:57:16.968 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 77309411337 2026-03-08T22:57:16.969 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:17.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 77309411337 -lt 77309411337 2026-03-08T22:57:17.188 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:17.189 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116999 2026-03-08T22:57:17.189 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:17.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:17.190 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116999 2026-03-08T22:57:17.190 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:17.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116999 2026-03-08T22:57:17.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116999' 2026-03-08T22:57:17.191 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 115964116999 2026-03-08T22:57:17.192 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:57:17.411 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116999 -lt 115964116999 2026-03-08T22:57:17.411 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:17.412 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:17.412 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 
2026-03-08T22:57:17.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:57:17.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:17.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:17.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:17.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:17.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:17.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:17.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:17.914 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:57:17.915 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:17.915 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:17.915 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:18.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:57:18.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:57:18.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:57:18.198 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT 2026-03-08T22:57:18.199 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout 2026-03-08T22:57:18.535 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset 2026-03-08T22:57:18.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:238: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure obj-size-102080-1-10 2026-03-08T22:57:18.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:57:18.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:57:18.551 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-1-10 2026-03-08T22:57:18.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:57:18.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:57:18.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get obj-size-102080-1-10 td/test-erasure-eio/COPY 2026-03-08T22:57:18.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:57:18.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:57:18.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: rados_get_data_bad_size: expr 1 + 1 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: rados_get_data_bad_size: shard_id=2 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:242: rados_get_data_bad_size: set_size obj-size-102080-1-10 td/test-erasure-eio 2 10 set 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local 
objname=obj-size-102080-1-10 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local dir=td/test-erasure-eio 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=2 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=10 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=set 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: get_osds pool-jerasure obj-size-102080-1-10 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local 
poolname=pool-jerasure 2026-03-08T22:57:18.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-size-102080-1-10 2026-03-08T22:57:18.585 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-size-102080-1-10 2026-03-08T22:57:18.585 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:57:18.807 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:57:18.807 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:18.807 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:57:18.807 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:57:18.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2') 2026-03-08T22:57:18.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds 2026-03-08T22:57:18.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=2 2026-03-08T22:57:18.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout 2026-03-08T22:57:19.070 INFO:tasks.workunit.client.0.vm04.stderr:noout is set 2026-03-08T22:57:19.082 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' set = add ']' 2026-03-08T22:57:19.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:210: set_size: '[' 10 = 0 ']' 2026-03-08T22:57:19.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:214: set_size: dd if=/dev/urandom bs=10 count=1 of=td/test-erasure-eio/CORRUPT 2026-03-08T22:57:19.083 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in 2026-03-08T22:57:19.083 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out 2026-03-08T22:57:19.083 INFO:tasks.workunit.client.0.vm04.stderr:10 bytes copied, 0.00010647 s, 93.9 kB/s 2026-03-08T22:57:19.083 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 2 obj-size-102080-1-10 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 2 obj-size-102080-1-10 
set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.2 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:19.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:19.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:19.391 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 2 obj-size-102080-1-10 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:19.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:57:19.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:19.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T22:57:19.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:19.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/2 2026-03-08T22:57:19.392 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/2 obj-size-102080-1-10 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 2 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:20.511 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:20.511 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:20.511 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:20.512 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:20.512 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:20.512 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:57:20.512 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:57:20.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:57:20.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:20.513 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:57:20.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:57:20.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:20.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:20.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:20.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:20.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:20.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:20.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:57:20.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T22:57:20.514 INFO:tasks.workunit.client.0.vm04.stderr:start osd.2 2026-03-08T22:57:20.514 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:20.514 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/2/whoami 2026-03-08T22:57:20.515 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T22:57:20.515 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:57:20.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:57:20.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:57:20.531 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:20.532+0000 7f778441e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:57:20.538 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:20.539+0000 7f778441e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:20.539 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:20.540+0000 7f778441e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:20.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T22:57:20.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:20.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:57:20.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:20.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:20.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:20.730 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:20.730 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:57:20.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:20.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:20.939 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:21.095 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:21.096+0000 7f778441e780 -1 Falling back to public interface 2026-03-08T22:57:21.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:21.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:21.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:21.941 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:21.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:21.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:21.970 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:21.972+0000 7f778441e780 -1 osd.2 45 log_to_monitors true 2026-03-08T22:57:22.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:23.047 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:23.048+0000 7f777ad85640 -1 osd.2 45 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:57:23.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:23.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 
)) 2026-03-08T22:57:23.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:23.182 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:57:23.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:23.182 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:23.390 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 up in weight 1 up_from 50 up_thru 38 down_at 47 last_clean_interval [18,45) [v2:127.0.0.1:6818/4038447088,v1:127.0.0.1:6819/4038447088] [v2:127.0.0.1:6820/4038447088,v1:127.0.0.1:6821/4038447088] exists,up ae882e4b-3141-437c-b90b-795070ff9b8d 2026-03-08T22:57:23.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:23.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:23.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:23.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:57:23.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:57:23.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:23.391 
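The `wait_for_osd` trace above polls `ceph osd dump` once a second, up to 300 times, until `osd.2 up` appears. A minimal runnable sketch of that polling pattern (reconstructed from the trace of ceph-helpers.sh:978-991, not its verbatim source); the `ceph` stub and its poll count are assumptions so the sketch runs without a cluster:

```shell
#!/usr/bin/env bash
# Polling pattern from wait_for_osd, reconstructed from the xtrace above.
# The real helper shells out to the ceph CLI; this stub stands in for it,
# reporting osd.2 up on the third poll (counter kept in a file because the
# stub runs in a pipeline subshell).
count_file=$(mktemp)
echo 0 > "$count_file"
ceph() {
    local n=$(( $(cat "$count_file") + 1 ))
    echo "$n" > "$count_file"
    if [ "$n" -ge 3 ]; then
        echo "osd.2 up   in  weight 1 ..."
    fi
}

wait_for_osd() {
    local state=$1 id=$2 status=1
    for ((i = 0; i < 300; i++)); do
        if ceph osd dump | grep -q "osd\.$id $state"; then
            status=0
            break
        fi
        sleep 0.01   # the real helper sleeps a full second between polls
    done
    return $status
}

wait_for_osd up 2 && polls=$(cat "$count_file")
echo "osd.2 reported up after $polls polls"
rm -f "$count_file"
```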
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:57:23.391 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:23.391 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:23.391 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:23.391 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:23.391 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:23.391 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:23.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:23.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:57:23.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:57:23.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:23.460 
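The `get_timeout_delays 90 .1` call above expands into the backoff schedule seen at ceph-helpers.sh:1659: delays double from 0.1s, individual sleeps are capped at 15s, and the last entry is trimmed so the total equals the 90s timeout. A sketch reproducing that schedule (the cap-and-trim semantics are inferred from the traced output, not read from the helper's source):

```shell
# Reconstruct the wait_for_clean backoff schedule from the trace:
# doubling delays starting at 0.1s, each sleep capped at 15s, final
# delay trimmed so the sum is exactly the 90s timeout.
delays=$(awk -v timeout=90 -v first=0.1 'BEGIN {
    d = first; sum = 0
    while (sum < timeout) {
        if (d > 15) d = 15                        # cap a single sleep at 15s
        if (sum + d > timeout) d = timeout - sum  # trim the last entry
        printf "%s%g", (sum ? " " : ""), d
        sum += d
        d *= 2
    }
}')
echo "$delays"
# prints: 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```

The printed schedule matches the `delays=(...)` array in the trace term for term.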
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:57:23.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:23.676 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:23.676 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:23.676 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:57:23.677 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:57:23.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:23.677 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:23.677 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:57:23.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803791 2026-03-08T22:57:23.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803791 2026-03-08T22:57:23.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791' 2026-03-08T22:57:23.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:23.745 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:23.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=176093659143 2026-03-08T22:57:23.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 176093659143 2026-03-08T22:57:23.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-176093659143' 2026-03-08T22:57:23.813 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:23.814 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:57:23.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364803 2026-03-08T22:57:23.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364803 2026-03-08T22:57:23.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-176093659143 2-214748364803' 2026-03-08T22:57:23.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:23.883 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:23.951 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117003 2026-03-08T22:57:23.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117003 2026-03-08T22:57:23.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803791 1-176093659143 2-214748364803 3-115964117003' 2026-03-08T22:57:23.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:23.951 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803791 2026-03-08T22:57:23.951 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:23.952 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:23.952 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803791 2026-03-08T22:57:23.952 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:23.953 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803791 2026-03-08T22:57:23.953 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803791' 2026-03-08T22:57:23.953 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 
seq 25769803791 2026-03-08T22:57:23.954 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:24.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803791 -lt 25769803791 2026-03-08T22:57:24.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:24.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-176093659143 2026-03-08T22:57:24.164 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:24.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:24.166 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-176093659143 2026-03-08T22:57:24.166 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:24.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=176093659143 2026-03-08T22:57:24.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 176093659143' 2026-03-08T22:57:24.167 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 176093659143 2026-03-08T22:57:24.168 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:24.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 176093659143 -lt 176093659143 2026-03-08T22:57:24.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:24.373 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-214748364803 2026-03-08T22:57:24.373 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:24.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:57:24.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-214748364803 2026-03-08T22:57:24.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:24.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364803 2026-03-08T22:57:24.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 214748364803' 2026-03-08T22:57:24.375 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 214748364803 2026-03-08T22:57:24.375 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:24.580 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364803 -lt 214748364803 2026-03-08T22:57:24.580 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:24.580 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117003 2026-03-08T22:57:24.580 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:24.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:24.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117003 2026-03-08T22:57:24.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:24.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117003 2026-03-08T22:57:24.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117003' 2026-03-08T22:57:24.583 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 115964117003 2026-03-08T22:57:24.583 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:57:24.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117003 -lt 
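The `flush_pg_stats` loop above (ceph-helpers.sh:2264-2277) records one `osd-seq` token per OSD, then splits each token back apart with `cut` before comparing against `ceph osd last-stat-seq`. A runnable reproduction of that string bookkeeping, using the sequence values reported in this run:

```shell
# flush_pg_stats seq bookkeeping, as traced: each token is "<osd>-<seq>",
# split on the dash with cut. Values below are the ones from this log.
seqs=' 0-25769803791 1-176093659143 2-214748364803 3-115964117003'
waits=$(for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)   # field before the dash: OSD id
    seq=$(echo "$s" | cut -d - -f 2)   # field after the dash: stat sequence
    echo "waiting osd.$osd seq $seq"
done)
echo "$waits"
```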
115964117003 2026-03-08T22:57:24.789 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:24.789 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:24.789 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:57:25.057 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:25.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:25.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:25.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:25.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:25.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:25.058 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq 
'.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:25.258 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:57:25.258 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:25.258 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:25.258 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:25.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 5 2026-03-08T22:57:25.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' -1 2026-03-08T22:57:25.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:57:25.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=1 2026-03-08T22:57:25.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:57:25.529 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:57:25.629 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:57:25.630 
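The jq program at ceph-helpers.sh:1368 counts PG states that contain both "active" and "clean" but not "stale". The same selection expressed in plain shell, over illustrative sample states (the sample list is invented for the sketch, not taken from this run):

```shell
# Count PG states matching the get_num_active_clean filter traced above:
# contains "active" and "clean", does not contain "stale".
states='active+clean active+recovering stale+active+clean active+clean+scrubbing'
count=0
for st in $states; do
    case $st in
        *stale*) ;;                                    # stale PGs are excluded
        *active*clean*|*clean*active*) count=$((count + 1)) ;;
    esac
done
echo "$count"
```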
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:25.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:25.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:25.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:25.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:25.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:25.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:25.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:57:25.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:25.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:25.829 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:26.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 5 2026-03-08T22:57:26.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' 1 2026-03-08T22:57:26.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:57:26.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:57:26.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:57:26.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:57:26.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:57:26.104 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:57:26.104 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 
2026-03-08T22:57:26.382 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null 2026-03-08T22:57:26.382 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:57:26.382 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 1 >= 13 )) 2026-03-08T22:57:26.382 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:57:26.382 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.2 2026-03-08T22:57:26.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:57:26.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:26.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:26.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:26.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:26.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 
2026-03-08T22:57:26.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:26.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:26.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:57:26.794 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:26.794 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:26.794 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:27.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 5 2026-03-08T22:57:27.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' 1 2026-03-08T22:57:27.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:57:27.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:57:27.078 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:57:27.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:57:27.078 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:57:27.078 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:57:27.079 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:57:27.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null 2026-03-08T22:57:27.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:57:27.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 2 >= 13 )) 2026-03-08T22:57:27.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:57:27.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.4 
2026-03-08T22:57:27.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:57:27.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:27.758 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:27.758 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:27.758 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:27.758 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:27.759 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:27.759 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:27.964 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:57:27.964 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:27.965 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:27.965 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:28.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 5 2026-03-08T22:57:28.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 1 '!=' 1 2026-03-08T22:57:28.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:57:28.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:57:28.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:57:28.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:57:28.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:57:28.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:57:28.235 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:57:28.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null 2026-03-08T22:57:28.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:57:28.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 3 >= 13 )) 2026-03-08T22:57:28.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:57:28.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.8 2026-03-08T22:57:29.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:57:29.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:29.321 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:29.321 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:29.321 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and 
contains("clean")) | ' 2026-03-08T22:57:29.321 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:29.321 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:29.322 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:29.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:57:29.537 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:29.537 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:29.537 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:29.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:57:29.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:57:29.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:57:29.817 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT 2026-03-08T22:57:29.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout 2026-03-08T22:57:30.076 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset 2026-03-08T22:57:30.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:243: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure obj-size-102080-1-10 fail 2026-03-08T22:57:30.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:57:30.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:57:30.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-1-10 2026-03-08T22:57:30.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=fail 2026-03-08T22:57:30.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' fail = fail ']' 2026-03-08T22:57:30.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:114: rados_get: rados --pool pool-jerasure get obj-size-102080-1-10 td/test-erasure-eio/COPY 2026-03-08T22:57:30.109 INFO:tasks.workunit.client.0.vm04.stderr:error getting 
pool-jerasure/obj-size-102080-1-10: (5) Input/output error 2026-03-08T22:57:30.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:115: rados_get: return 2026-03-08T22:57:30.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:244: rados_get_data_bad_size: rm td/test-erasure-eio/ORIGINAL 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:324: TEST_rados_get_bad_size_shard_1: rados_get_data_bad_size td/test-erasure-eio 1 0 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:222: rados_get_data_bad_size: local dir=td/test-erasure-eio 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:223: rados_get_data_bad_size: shift 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:224: rados_get_data_bad_size: local shard_id=1 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:225: rados_get_data_bad_size: shift 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:226: rados_get_data_bad_size: local bytes=0 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:227: rados_get_data_bad_size: shift 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:228: 
rados_get_data_bad_size: local mode=set 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:230: rados_get_data_bad_size: local poolname=pool-jerasure 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:231: rados_get_data_bad_size: local objname=obj-size-102080-1-0 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:232: rados_get_data_bad_size: rados_put td/test-erasure-eio pool-jerasure obj-size-102080-1-0 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=obj-size-102080-1-0 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:57:30.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:57:30.113 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:57:30.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:57:30.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:57:30.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:57:30.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:57:30.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put obj-size-102080-1-0 td/test-erasure-eio/ORIGINAL 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:236: rados_get_data_bad_size: set_size obj-size-102080-1-0 td/test-erasure-eio 1 0 set 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local objname=obj-size-102080-1-0 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local dir=td/test-erasure-eio 2026-03-08T22:57:30.138 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=1 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=0 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=set 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: get_osds pool-jerasure obj-size-102080-1-0 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:57:30.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-size-102080-1-0 2026-03-08T22:57:30.139 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-size-102080-1-0 
2026-03-08T22:57:30.139 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:57:30.366 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:57:30.366 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:30.366 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:57:30.366 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:57:30.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2') 2026-03-08T22:57:30.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds 2026-03-08T22:57:30.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=1 2026-03-08T22:57:30.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout 2026-03-08T22:57:30.695 INFO:tasks.workunit.client.0.vm04.stderr:noout is set 2026-03-08T22:57:30.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' set = add ']' 2026-03-08T22:57:30.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:210: set_size: '[' 0 = 0 ']' 2026-03-08T22:57:30.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:212: set_size: touch 
td/test-erasure-eio/CORRUPT 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 1 obj-size-102080-1-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 1 obj-size-102080-1-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:30.707 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.1 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:30.707 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:30.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:30.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:30.708 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:31.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:31.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 1 obj-size-102080-1-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:31.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:57:31.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:31.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 
2026-03-08T22:57:31.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:31.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/1 2026-03-08T22:57:31.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/1 obj-size-102080-1-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:32.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 1 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 
2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:32.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:57:32.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:32.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:32.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: 
get_asok_path: '[' -n '' ']' 2026-03-08T22:57:32.125 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:32.125 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:32.125 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:57:32.125 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:57:32.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:57:32.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:32.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:57:32.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:57:32.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:32.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:32.127 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:32.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:32.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:32.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:32.127 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:57:32.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:57:32.128 INFO:tasks.workunit.client.0.vm04.stderr:start osd.1 2026-03-08T22:57:32.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 
2026-03-08T22:57:32.128 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/1/whoami 2026-03-08T22:57:32.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:57:32.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:57:32.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:57:32.132 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:57:32.146 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:32.147+0000 7f3cdee0f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:32.150 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:32.151+0000 7f3cdee0f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:32.151 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:32.152+0000 7f3cdee0f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:57:32.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:57:32.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:32.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:57:32.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:32.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:32.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:32.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:32.348 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:57:32.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:32.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:32.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:33.217 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:33.218+0000 7f3cdee0f780 -1 Falling back to public interface 2026-03-08T22:57:33.576 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:57:33.576 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:33.576 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:33.576 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:33.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:33.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:33.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:34.052 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:34.054+0000 7f3cdee0f780 -1 osd.1 53 log_to_monitors true 2026-03-08T22:57:34.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:34.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:34.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:34.793 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:57:34.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:34.793 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:34.929 
INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:34.930+0000 7f3cd5842640 -1 osd.1 53 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:57:35.047 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 up in weight 1 up_from 59 up_thru 50 down_at 56 last_clean_interval [41,53) [v2:127.0.0.1:6810/2480858484,v1:127.0.0.1:6811/2480858484] [v2:127.0.0.1:6812/2480858484,v1:127.0.0.1:6813/2480858484] exists,up 0649b5ed-f2a2-46d8-890a-68ec9ad1519c 2026-03-08T22:57:35.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:35.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:35.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:35.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:57:35.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:57:35.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:35.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:57:35.048 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:35.048 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: 
get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:35.048 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:35.048 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:35.048 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:35.048 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:35.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:35.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:57:35.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:57:35.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:35.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:57:35.129 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:35.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:35.339 
INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:35.339 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:57:35.339 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:57:35.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:35.339 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:35.339 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:57:35.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803796 2026-03-08T22:57:35.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803796 2026-03-08T22:57:35.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803796' 2026-03-08T22:57:35.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:35.413 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:35.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=253403070467 2026-03-08T22:57:35.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 253403070467 2026-03-08T22:57:35.488 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803796 1-253403070467' 2026-03-08T22:57:35.488 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:35.488 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:57:35.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364807 2026-03-08T22:57:35.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364807 2026-03-08T22:57:35.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803796 1-253403070467 2-214748364807' 2026-03-08T22:57:35.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:35.559 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:35.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117007 2026-03-08T22:57:35.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117007 2026-03-08T22:57:35.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803796 1-253403070467 2-214748364807 3-115964117007' 
2026-03-08T22:57:35.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:35.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803796 2026-03-08T22:57:35.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:35.631 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:35.631 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803796 2026-03-08T22:57:35.631 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:35.632 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803796 2026-03-08T22:57:35.632 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803796' 2026-03-08T22:57:35.632 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803796 2026-03-08T22:57:35.632 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:35.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803793 -lt 25769803796 2026-03-08T22:57:35.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 
2026-03-08T22:57:36.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:57:36.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:37.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803796 -lt 25769803796 2026-03-08T22:57:37.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:37.063 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-253403070467 2026-03-08T22:57:37.063 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:37.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:37.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-253403070467 2026-03-08T22:57:37.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:37.066 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=253403070467 2026-03-08T22:57:37.066 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 253403070467' 2026-03-08T22:57:37.066 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 
seq 253403070467 2026-03-08T22:57:37.066 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:37.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 253403070467 -lt 253403070467 2026-03-08T22:57:37.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:37.279 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-214748364807 2026-03-08T22:57:37.280 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:37.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:57:37.281 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-214748364807 2026-03-08T22:57:37.281 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:37.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364807 2026-03-08T22:57:37.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 214748364807' 2026-03-08T22:57:37.282 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 214748364807 2026-03-08T22:57:37.282 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:37.509 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364807 -lt 214748364807 2026-03-08T22:57:37.510 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:37.510 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117007 2026-03-08T22:57:37.510 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:37.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:37.512 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117007 2026-03-08T22:57:37.512 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:37.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117007 2026-03-08T22:57:37.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117007' 2026-03-08T22:57:37.513 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 115964117007 2026-03-08T22:57:37.513 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 
3 2026-03-08T22:57:37.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117007 -lt 115964117007 2026-03-08T22:57:37.732 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:37.732 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:37.732 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json 
pg dump pgs 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:38.522 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:57:38.522 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:57:38.522 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:57:38.522 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT 2026-03-08T22:57:38.522 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout 2026-03-08T22:57:38.832 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset 2026-03-08T22:57:38.844 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:238: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure obj-size-102080-1-0 2026-03-08T22:57:38.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:57:38.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:57:38.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-1-0 2026-03-08T22:57:38.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:57:38.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:57:38.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get obj-size-102080-1-0 td/test-erasure-eio/COPY 2026-03-08T22:57:38.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:57:38.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:57:38.873 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: 
rados_get_data_bad_size: expr 1 + 1 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: rados_get_data_bad_size: shard_id=2 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:242: rados_get_data_bad_size: set_size obj-size-102080-1-0 td/test-erasure-eio 2 0 set 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local objname=obj-size-102080-1-0 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local dir=td/test-erasure-eio 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=2 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=0 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift 2026-03-08T22:57:38.875 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=set 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: get_osds pool-jerasure obj-size-102080-1-0 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:57:38.875 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-size-102080-1-0 2026-03-08T22:57:38.876 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-size-102080-1-0 2026-03-08T22:57:38.876 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:57:39.096 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:57:39.096 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:39.096 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:57:39.096 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:57:39.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2') 2026-03-08T22:57:39.096 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds 2026-03-08T22:57:39.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=2 2026-03-08T22:57:39.096 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout 2026-03-08T22:57:39.405 INFO:tasks.workunit.client.0.vm04.stderr:noout is set 2026-03-08T22:57:39.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' set = add ']' 2026-03-08T22:57:39.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:210: set_size: '[' 0 = 0 ']' 2026-03-08T22:57:39.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:212: set_size: touch td/test-erasure-eio/CORRUPT 2026-03-08T22:57:39.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 2 obj-size-102080-1-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:39.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:57:39.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:57:39.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T22:57:39.417 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:39.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 2 obj-size-102080-1-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:39.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:57:39.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:39.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T22:57:39.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:39.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.2 2026-03-08T22:57:39.418 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:39.418 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:39.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:39.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:39.418 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:39.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:39.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 2 obj-size-102080-1-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:39.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:57:39.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:39.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T22:57:39.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:39.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/2 2026-03-08T22:57:39.524 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/2 obj-size-102080-1-0 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 2 2026-03-08T22:57:40.625 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:57:40.625 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:57:40.625 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:57:40.626 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:57:40.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:40.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:57:40.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:57:40.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:40.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:40.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:40.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:40.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:40.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:40.626 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:57:40.627 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T22:57:40.627 INFO:tasks.workunit.client.0.vm04.stderr:start osd.2 2026-03-08T22:57:40.627 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:40.628 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/2/whoami 2026-03-08T22:57:40.628 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T22:57:40.629 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:57:40.629 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:57:40.631 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:57:40.647 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:40.648+0000 7f5cca153780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:40.650 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:40.651+0000 7f5cca153780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:40.651 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:40.652+0000 7f5cca153780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:40.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T22:57:40.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:40.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:57:40.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:40.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:40.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:40.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:40.838 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:57:40.838 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:40.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:41.053 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:41.459 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:41.460+0000 7f5cca153780 -1 Falling back to public interface 2026-03-08T22:57:42.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:42.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:42.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:42.054 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:42.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:42.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:42.271 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:42.363 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:42.364+0000 7f5cca153780 -1 osd.2 63 log_to_monitors true 2026-03-08T22:57:43.263 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:43.264+0000 7f5cc10e9640 -1 osd.2 63 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or 
directory 2026-03-08T22:57:43.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:43.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:43.272 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:43.272 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:57:43.273 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:43.273 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:43.512 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 up in weight 1 up_from 68 up_thru 56 down_at 65 last_clean_interval [50,63) [v2:127.0.0.1:6818/1227893059,v1:127.0.0.1:6819/1227893059] [v2:127.0.0.1:6820/1227893059,v1:127.0.0.1:6821/1227893059] exists,up ae882e4b-3141-437c-b90b-795070ff9b8d 2026-03-08T22:57:43.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:43.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:43.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:43.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:57:43.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: 
wait_for_clean: local cmd= 2026-03-08T22:57:43.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:43.512 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:57:43.512 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:43.513 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:43.513 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:43.513 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:43.513 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:43.513 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:43.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:43.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:57:43.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local 
-i loop=0 2026-03-08T22:57:43.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:43.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:57:43.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:43.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:43.789 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:43.789 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:57:43.789 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:57:43.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:43.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:43.789 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:57:43.861 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803799 2026-03-08T22:57:43.861 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803799 2026-03-08T22:57:43.861 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803799' 2026-03-08T22:57:43.861 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:43.861 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:43.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=253403070470 2026-03-08T22:57:43.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 253403070470 2026-03-08T22:57:43.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803799 1-253403070470' 2026-03-08T22:57:43.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:43.936 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:57:44.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776131 2026-03-08T22:57:44.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776131 2026-03-08T22:57:44.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803799 1-253403070470 2-292057776131' 2026-03-08T22:57:44.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:44.008 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:44.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117011 2026-03-08T22:57:44.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117011 2026-03-08T22:57:44.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803799 1-253403070470 2-292057776131 3-115964117011' 2026-03-08T22:57:44.082 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:44.082 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803799 2026-03-08T22:57:44.083 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:44.084 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:44.084 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803799 2026-03-08T22:57:44.084 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:44.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803799 2026-03-08T22:57:44.085 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803799' 2026-03-08T22:57:44.085 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803799 2026-03-08T22:57:44.085 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:44.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803799 -lt 25769803799 2026-03-08T22:57:44.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:44.300 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-253403070470 2026-03-08T22:57:44.300 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:44.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:44.302 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-253403070470 2026-03-08T22:57:44.302 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:44.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=253403070470 2026-03-08T22:57:44.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 
253403070470' 2026-03-08T22:57:44.303 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 253403070470 2026-03-08T22:57:44.304 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:44.521 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 253403070470 -lt 253403070470 2026-03-08T22:57:44.521 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:44.521 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776131 2026-03-08T22:57:44.521 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:44.522 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:57:44.523 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776131 2026-03-08T22:57:44.523 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:44.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776131 2026-03-08T22:57:44.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776131' 2026-03-08T22:57:44.523 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 292057776131 2026-03-08T22:57:44.524 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:44.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776131 -lt 292057776131 2026-03-08T22:57:44.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:44.734 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117011 2026-03-08T22:57:44.734 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:44.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:44.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117011 2026-03-08T22:57:44.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:44.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117011 2026-03-08T22:57:44.737 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117011' 2026-03-08T22:57:44.737 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 115964117011 2026-03-08T22:57:44.737 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 
3
2026-03-08T22:57:44.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117011 -lt 115964117011
2026-03-08T22:57:44.940 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:57:44.940 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:57:44.941 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:57:45.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:57:45.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:57:45.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:57:45.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:57:45.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:57:45.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:57:45.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:57:45.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:57:45.421 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:57:45.421 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:57:45.421 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:57:45.421 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:57:45.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:57:45.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:57:45.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:57:45.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT
2026-03-08T22:57:45.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout
2026-03-08T22:57:45.962 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset
2026-03-08T22:57:45.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:243: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure obj-size-102080-1-0 fail
2026-03-08T22:57:45.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio
2026-03-08T22:57:45.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure
2026-03-08T22:57:45.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-1-0
2026-03-08T22:57:45.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=fail
2026-03-08T22:57:45.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' fail = fail ']'
2026-03-08T22:57:45.973 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:114: rados_get: rados --pool pool-jerasure get obj-size-102080-1-0 td/test-erasure-eio/COPY
2026-03-08T22:57:45.995 INFO:tasks.workunit.client.0.vm04.stderr:error getting pool-jerasure/obj-size-102080-1-0: (5) Input/output error
2026-03-08T22:57:45.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:115: rados_get: return
2026-03-08T22:57:45.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:244: rados_get_data_bad_size: rm td/test-erasure-eio/ORIGINAL
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:325: TEST_rados_get_bad_size_shard_1: rados_get_data_bad_size td/test-erasure-eio 1 256 add
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:222: rados_get_data_bad_size: local dir=td/test-erasure-eio
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:223: rados_get_data_bad_size: shift
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:224: rados_get_data_bad_size: local shard_id=1
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:225: rados_get_data_bad_size: shift
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:226: rados_get_data_bad_size: local bytes=256
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:227: rados_get_data_bad_size: shift
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:228: rados_get_data_bad_size: local mode=add
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:230: rados_get_data_bad_size: local poolname=pool-jerasure
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:231: rados_get_data_bad_size: local objname=obj-size-102080-1-256
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:232: rados_get_data_bad_size: rados_put td/test-erasure-eio pool-jerasure obj-size-102080-1-256
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=obj-size-102080-1-256
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD
2026-03-08T22:57:45.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put obj-size-102080-1-256 td/test-erasure-eio/ORIGINAL
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:236: rados_get_data_bad_size: set_size obj-size-102080-1-256 td/test-erasure-eio 1 256 add
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local objname=obj-size-102080-1-256
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local dir=td/test-erasure-eio
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=1
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=256
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=add
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure
2026-03-08T22:57:46.023 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: get_osds pool-jerasure obj-size-102080-1-256
2026-03-08T22:57:46.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T22:57:46.024 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-size-102080-1-256
2026-03-08T22:57:46.024 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-size-102080-1-256
2026-03-08T22:57:46.024 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T22:57:46.233 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T22:57:46.233 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:57:46.233 INFO:tasks.workunit.client.0.vm04.stderr:2'
2026-03-08T22:57:46.233 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2
2026-03-08T22:57:46.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2')
2026-03-08T22:57:46.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds
2026-03-08T22:57:46.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=1
2026-03-08T22:57:46.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout
2026-03-08T22:57:46.571 INFO:tasks.workunit.client.0.vm04.stderr:noout is set
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' add = add ']'
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:208: set_size: objectstore_tool td/test-erasure-eio 1 obj-size-102080-1-256 get-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 1 obj-size-102080-1-256 get-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.1
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:57:46.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:57:46.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:57:46.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 1 obj-size-102080-1-256 get-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:57:46.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio
2026-03-08T22:57:46.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T22:57:46.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T22:57:46.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T22:57:46.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/1
2026-03-08T22:57:46.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/1 obj-size-102080-1-256 get-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 1
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/1
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1'
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal'
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:57:47.275 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:57:47.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:57:47.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:57:47.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:57:47.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:57:47.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:57:47.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:57:47.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:57:47.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:57:47.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T22:57:47.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T22:57:47.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/1
2026-03-08T22:57:47.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T22:57:47.277 INFO:tasks.workunit.client.0.vm04.stderr:start osd.1
2026-03-08T22:57:47.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:57:47.277 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/1/whoami
2026-03-08T22:57:47.278 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T22:57:47.279 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:57:47.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:57:47.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:57:47.298 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:47.298+0000 7f05c380c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:57:47.305 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:47.306+0000 7f05c380c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:57:47.306 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:47.307+0000 7f05c380c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:57:47.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T22:57:47.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:57:47.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:57:47.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:57:47.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:57:47.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:57:47.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:57:47.493 INFO:tasks.workunit.client.0.vm04.stderr:0
2026-03-08T22:57:47.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:57:47.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:57:47.711 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:57:48.118 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:48.120+0000 7f05c380c780 -1 Falling back to public interface
2026-03-08T22:57:48.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:57:48.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:57:48.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:57:48.713 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:57:48.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:57:48.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:57:48.945 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:57:49.251 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:49.252+0000 7f05c380c780 -1 osd.1 73 log_to_monitors true
2026-03-08T22:57:49.946 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:57:49.946 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:57:49.946 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:57:49.947 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:57:49.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:57:49.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:57:50.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:57:51.195 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:57:51.195 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:57:51.195 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:57:51.195 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:57:51.195 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:57:51.195 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:57:51.410 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 up in weight 1 up_from 77 up_thru 77 down_at 74 last_clean_interval [59,73) [v2:127.0.0.1:6810/2676329777,v1:127.0.0.1:6811/2676329777] [v2:127.0.0.1:6812/2676329777,v1:127.0.0.1:6813/2676329777] exists,up 0649b5ed-f2a2-46d8-890a-68ec9ad1519c
2026-03-08T22:57:51.410 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:57:51.410 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:57:51.410 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:57:51.410 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:57:51.410 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:57:51.410 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:57:51.411 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:57:51.411 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:57:51.411 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:57:51.411 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:57:51.412 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:57:51.412 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:57:51.412 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:57:51.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:57:51.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:57:51.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:57:51.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:57:51.490 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:57:51.490 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:57:51.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:57:51.703 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:57:51.703 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:57:51.703 INFO:tasks.workunit.client.0.vm04.stderr:3'
2026-03-08T22:57:51.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:57:51.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:57:51.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:57:51.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803803
2026-03-08T22:57:51.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803803
2026-03-08T22:57:51.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803803'
2026-03-08T22:57:51.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:57:51.773 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:57:51.842 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=330712481795
2026-03-08T22:57:51.843 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 330712481795
2026-03-08T22:57:51.843 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803803 1-330712481795'
2026-03-08T22:57:51.843 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:57:51.843 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:57:51.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776134
2026-03-08T22:57:51.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776134
2026-03-08T22:57:51.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803803 1-330712481795 2-292057776134'
2026-03-08T22:57:51.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:57:51.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:57:51.990 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117014
2026-03-08T22:57:51.990 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117014
2026-03-08T22:57:51.990 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803803 1-330712481795 2-292057776134 3-115964117014'
2026-03-08T22:57:51.990 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:57:51.991 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803803
2026-03-08T22:57:51.991 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:51.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:51.993 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803803 2026-03-08T22:57:51.993 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:51.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803803 2026-03-08T22:57:51.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803803' 2026-03-08T22:57:51.994 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803803 2026-03-08T22:57:51.994 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:52.221 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803803 -lt 25769803803 2026-03-08T22:57:52.221 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:52.221 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-330712481795 2026-03-08T22:57:52.221 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut 
-d - -f 1 2026-03-08T22:57:52.222 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:52.222 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-330712481795 2026-03-08T22:57:52.223 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:52.223 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=330712481795 2026-03-08T22:57:52.224 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 330712481795' 2026-03-08T22:57:52.224 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 330712481795 2026-03-08T22:57:52.224 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:52.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 330712481795 -lt 330712481795 2026-03-08T22:57:52.448 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:52.448 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776134 2026-03-08T22:57:52.448 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:52.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T22:57:52.450 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776134 2026-03-08T22:57:52.450 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:52.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776134 2026-03-08T22:57:52.451 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776134' 2026-03-08T22:57:52.452 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 292057776134 2026-03-08T22:57:52.452 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:52.683 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776134 -lt 292057776134 2026-03-08T22:57:52.683 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:52.684 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117014 2026-03-08T22:57:52.684 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:52.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:52.686 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117014 2026-03-08T22:57:52.686 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:52.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117014 2026-03-08T22:57:52.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117014' 2026-03-08T22:57:52.687 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 115964117014 2026-03-08T22:57:52.687 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:57:52.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117014 -lt 115964117014 2026-03-08T22:57:52.907 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:52.908 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:52.908 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:53.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:57:53.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 
2026-03-08T22:57:53.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:53.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:53.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:53.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:53.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:53.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:53.411 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:57:53.411 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:53.412 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:53.412 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:53.698 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:57:53.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:57:53.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:57:53.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:209: set_size: dd if=/dev/urandom bs=256 count=1 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:256 bytes copied, 5.5895e-05 s, 4.6 MB/s 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 1 obj-size-102080-1-256 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:53.699 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 1 obj-size-102080-1-256 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.1 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:53.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:53.700 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:53.700 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 
2026-03-08T22:57:53.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:53.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 1 obj-size-102080-1-256 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:53.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:57:53.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:53.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:57:53.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:53.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/1 2026-03-08T22:57:53.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/1 obj-size-102080-1-256 set-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 1 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local 
dir=td/test-erasure-eio 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 
2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:54.962 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:57:54.963 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:54.963 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:57:54.964 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:57:54.964 INFO:tasks.workunit.client.0.vm04.stderr:start osd.1 2026-03-08T22:57:54.964 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:54.965 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/1/whoami 2026-03-08T22:57:54.965 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:57:54.966 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:57:54.967 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:57:54.968 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:57:54.983 
INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:54.984+0000 7fca36ac6780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:54.990 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:54.991+0000 7fca36ac6780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:54.991 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:54.992+0000 7fca36ac6780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:55.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:57:55.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:55.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:57:55.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:55.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:55.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:55.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:55.189 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:57:55.189 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:55.189 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:55.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:55.797 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:55.797+0000 7fca36ac6780 -1 Falling back to public interface 2026-03-08T22:57:56.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:56.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:56.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:56.407 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:56.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:56.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:56.621 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:56.665 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:57:56.666+0000 7fca36ac6780 -1 osd.1 78 log_to_monitors true 2026-03-08T22:57:57.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:57.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:57.624 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:57.624 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:57:57.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:57.624 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:57.840 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 up in weight 1 up_from 82 up_thru 82 down_at 79 last_clean_interval [77,78) [v2:127.0.0.1:6810/900443520,v1:127.0.0.1:6811/900443520] [v2:127.0.0.1:6812/900443520,v1:127.0.0.1:6813/900443520] exists,up 0649b5ed-f2a2-46d8-890a-68ec9ad1519c 2026-03-08T22:57:57.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:57.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:57.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:57.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:57:57.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:57:57.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:57.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local 
cur_active_clean 2026-03-08T22:57:57.841 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:57.841 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:57.841 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:57.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:57.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:57.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:57.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:57.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:57:57.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:57:57.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:57.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 
2026-03-08T22:57:57.922 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:58.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:58.141 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:57:58.141 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:57:58.141 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:57:58.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:58.141 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:58.141 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:57:58.210 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803806 2026-03-08T22:57:58.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803806 2026-03-08T22:57:58.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803806' 2026-03-08T22:57:58.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:58.211 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:58.286 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=352187318275 2026-03-08T22:57:58.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 352187318275 2026-03-08T22:57:58.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803806 1-352187318275' 2026-03-08T22:57:58.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:58.286 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:57:58.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776138 2026-03-08T22:57:58.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776138 2026-03-08T22:57:58.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803806 1-352187318275 2-292057776138' 2026-03-08T22:57:58.356 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:58.356 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:58.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117018 2026-03-08T22:57:58.433 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117018 2026-03-08T22:57:58.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803806 1-352187318275 2-292057776138 3-115964117018' 2026-03-08T22:57:58.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:58.434 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803806 2026-03-08T22:57:58.434 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:58.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:58.436 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803806 2026-03-08T22:57:58.436 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:58.437 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803806 2026-03-08T22:57:58.437 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803806' 2026-03-08T22:57:58.437 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803806 2026-03-08T22:57:58.438 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 0 2026-03-08T22:57:58.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803804 -lt 25769803806 2026-03-08T22:57:58.662 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:59.663 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:57:59.664 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:59.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803804 -lt 25769803806 2026-03-08T22:57:59.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:58:00.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:58:00.883 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:01.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803807 -lt 25769803806 2026-03-08T22:58:01.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:01.104 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-352187318275 2026-03-08T22:58:01.104 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:01.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:58:01.105 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-352187318275 2026-03-08T22:58:01.105 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=352187318275 2026-03-08T22:58:01.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 352187318275' 2026-03-08T22:58:01.106 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 352187318275 2026-03-08T22:58:01.107 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:58:01.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318275 -lt 352187318275 2026-03-08T22:58:01.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:01.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776138 2026-03-08T22:58:01.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T22:58:01.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:58:01.328 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776138 2026-03-08T22:58:01.328 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:01.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776138 2026-03-08T22:58:01.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776138' 2026-03-08T22:58:01.329 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 292057776138 2026-03-08T22:58:01.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:58:01.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776138 -lt 292057776138 2026-03-08T22:58:01.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:01.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117018 2026-03-08T22:58:01.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:01.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
osd=3 2026-03-08T22:58:01.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117018 2026-03-08T22:58:01.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:01.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117018 2026-03-08T22:58:01.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117018' 2026-03-08T22:58:01.556 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 115964117018 2026-03-08T22:58:01.556 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:58:01.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117018 -lt 115964117018 2026-03-08T22:58:01.784 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:58:01.785 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:01.785 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:02.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:58:02.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: 
wait_for_clean: true 2026-03-08T22:58:02.068 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:58:02.068 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:58:02.068 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:58:02.068 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:58:02.068 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:58:02.069 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:58:02.276 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:58:02.276 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:58:02.276 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:02.276 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:02.558 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:58:02.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:58:02.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:58:02.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT 2026-03-08T22:58:02.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout 2026-03-08T22:58:02.878 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset 2026-03-08T22:58:02.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:238: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure obj-size-102080-1-256 2026-03-08T22:58:02.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:58:02.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:58:02.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-1-256 2026-03-08T22:58:02.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:58:02.891 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:58:02.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get obj-size-102080-1-256 td/test-erasure-eio/COPY 2026-03-08T22:58:02.916 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:58:02.917 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:58:02.918 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: rados_get_data_bad_size: expr 1 + 1 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:241: rados_get_data_bad_size: shard_id=2 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:242: rados_get_data_bad_size: set_size obj-size-102080-1-256 td/test-erasure-eio 2 256 add 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:192: set_size: local objname=obj-size-102080-1-256 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:193: set_size: shift 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:194: set_size: local 
dir=td/test-erasure-eio 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:195: set_size: shift 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:196: set_size: local shard_id=2 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:197: set_size: shift 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:198: set_size: local bytes=256 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:199: set_size: shift 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:200: set_size: local mode=add 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:202: set_size: local poolname=pool-jerasure 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: get_osds pool-jerasure obj-size-102080-1-256 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:58:02.919 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-size-102080-1-256 2026-03-08T22:58:02.920 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph 
--format json osd map pool-jerasure obj-size-102080-1-256 2026-03-08T22:58:02.920 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:58:03.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:58:03.138 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:58:03.138 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:58:03.138 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:58:03.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: initial_osds=('3' '1' '2') 2026-03-08T22:58:03.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:203: set_size: local -a initial_osds 2026-03-08T22:58:03.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:204: set_size: local osd_id=2 2026-03-08T22:58:03.139 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:205: set_size: ceph osd set noout 2026-03-08T22:58:03.449 INFO:tasks.workunit.client.0.vm04.stderr:noout is set 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:206: set_size: '[' add = add ']' 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:208: set_size: objectstore_tool td/test-erasure-eio 2 obj-size-102080-1-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:58:03.462 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 2 obj-size-102080-1-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.2 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o 
xtrace 2026-03-08T22:58:03.462 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:58:03.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:58:03.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:58:03.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:58:03.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:58:03.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 2 obj-size-102080-1-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:58:03.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T22:58:03.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:58:03.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T22:58:03.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:58:03.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local 
osd_data=td/test-erasure-eio/2 2026-03-08T22:58:03.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/2 obj-size-102080-1-256 get-bytes td/test-erasure-eio/CORRUPT 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 2 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: 
activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: 
get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:04.365 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:58:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:58:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:58:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:58:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:58:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:58:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: 
ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:58:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:58:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:58:04.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:58:04.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T22:58:04.367 INFO:tasks.workunit.client.0.vm04.stderr:start osd.2 2026-03-08T22:58:04.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:58:04.367 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/2/whoami 2026-03-08T22:58:04.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 
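The trace above shows `activate_osd` accumulating flags into a single `ceph_args` string, with entries such as `--log-file=td/test-erasure-eio/$name.log` deliberately single-quoted so that `ceph-osd`, not the shell, expands the `$name` metavariable. A minimal sketch of that accumulation pattern (the function name and its `dir`/`id` interface are illustrative, not the real helper's):

```shell
# Sketch of activate_osd's arg-accumulation pattern. Note the mix of
# shell-expanded values and single-quoted '$name', which is passed
# through literally for the daemon to expand.
start_test_osd() {
    local dir=$1 id=$2
    local args=""
    args+=" --osd-data=$dir/$id"              # expanded by the shell now
    args+=" --log-file=$dir/"'$name'".log"    # left for ceph-osd to expand
    echo ceph-osd -i "$id" $args              # echo instead of exec, for the sketch
}
start_test_osd td/test-erasure-eio 2
```

Keeping the metavariable quoted is what makes the later `ceph-osd -i 2 ... '--log-file=td/test-erasure-eio/$name.log'` invocation in the trace work: the daemon substitutes `osd.2` for `$name` itself.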
2026-03-08T22:58:04.368 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:58:04.369 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:58:04.370 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:58:04.384 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:04.385+0000 7f9e73038780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:04.385 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:04.387+0000 7f9e73038780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:04.388 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:04.389+0000 7f9e73038780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:04.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T22:58:04.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:04.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:58:04.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:04.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:04.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:04.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:04.582 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T22:58:04.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:04.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:04.799 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:04.950 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:04.951+0000 7f9e73038780 -1 Falling back to public interface 2026-03-08T22:58:05.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:58:05.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:05.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:05.801 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:58:05.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:05.801 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:06.014 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:06.084 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:06.086+0000 7f9e73038780 -1 osd.2 86 log_to_monitors true 2026-03-08T22:58:06.882 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:06.883+0000 7f9e69fd0640 -1 osd.2 86 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:58:07.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:07.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:07.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:58:07.016 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:58:07.016 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:07.016 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:07.234 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 up in weight 1 up_from 91 up_thru 79 down_at 88 last_clean_interval [68,86) [v2:127.0.0.1:6818/3216567903,v1:127.0.0.1:6819/3216567903] [v2:127.0.0.1:6820/3216567903,v1:127.0.0.1:6821/3216567903] exists,up ae882e4b-3141-437c-b90b-795070ff9b8d 2026-03-08T22:58:07.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:58:07.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:58:07.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:58:07.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:58:07.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:58:07.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:58:07.234 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:58:07.234 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:58:07.235 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 
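The `wait_for_osd` calls traced above poll `ceph osd dump | grep "osd.$id up"` once per second for up to 300 attempts. Stripped of the Ceph specifics, the control flow is a plain bounded polling loop; the sketch below uses an arbitrary command as a stand-in for the `ceph osd dump | grep` check:

```shell
# Generic poll-until-success loop mirroring wait_for_osd's structure.
# "$@" stands in for `ceph osd dump | grep "osd.$id up"`.
wait_for() {
    local tries=$1; shift
    local i
    for ((i = 0; i < tries; i++)); do
        if "$@"; then
            return 0      # condition met
        fi
        sleep 1           # the helper polls once per second
    done
    return 1              # gave up after $tries attempts
}
```

In the trace the condition fails twice (the grep finds nothing at iterations 0 and 1) and succeeds at iteration 2, once osd.2 reports `up` in the osdmap.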
2026-03-08T22:58:07.235 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:58:07.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:58:07.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:58:07.235 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:58:07.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:58:07.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:58:07.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:58:07.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:58:07.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:58:07.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:58:07.535 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:58:07.535 
INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:58:07.535 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:58:07.535 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:58:07.535 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:58:07.535 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:07.535 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:58:07.605 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803810 2026-03-08T22:58:07.605 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803810 2026-03-08T22:58:07.605 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803810' 2026-03-08T22:58:07.605 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:07.605 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:58:07.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=352187318279 2026-03-08T22:58:07.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 352187318279 2026-03-08T22:58:07.678 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803810 1-352187318279' 2026-03-08T22:58:07.678 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:07.679 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:58:07.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=390842023939 2026-03-08T22:58:07.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 390842023939 2026-03-08T22:58:07.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803810 1-352187318279 2-390842023939' 2026-03-08T22:58:07.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:07.754 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:58:07.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117022 2026-03-08T22:58:07.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117022 2026-03-08T22:58:07.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803810 1-352187318279 2-390842023939 3-115964117022' 
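The `delays` array printed earlier by `get_timeout_delays 90 .1` — `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5` — is an exponential backoff schedule: delays double from the initial step, plateau at a cap (15 in this trace), and a final remainder entry tops the series up to exactly the 90-second budget. The following is a reconstruction of that behaviour inferred from the observed output, with the cap taken as a parameter; it is not the real helper's code:

```shell
# Reconstruction (from observed output) of the backoff schedule that
# get_timeout_delays appears to produce: double until the cap, repeat
# the cap, then emit the remainder so the delays sum to $timeout.
get_timeout_delays_sketch() {
    local timeout=$1 first=$2 cap=$3
    awk -v t="$timeout" -v d="$first" -v cap="$cap" 'BEGIN {
        total = 0; sep = ""
        while (1) {
            if (d > cap) d = cap                 # plateau at the cap
            if (total + d >= t) break
            printf "%s%g", sep, d; sep = " "
            total += d
            if (d < cap) d *= 2                  # exponential growth phase
        }
        if (t > total) printf "%s%g", sep, t - total   # remainder tops up to t
        print ""
    }'
}
get_timeout_delays_sketch 90 .1 15
```

Summing the emitted schedule gives exactly the requested timeout, which is why `wait_for_clean` can iterate the array with no separate deadline bookkeeping.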
2026-03-08T22:58:07.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:07.827 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803810 2026-03-08T22:58:07.827 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:07.828 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:58:07.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803810 2026-03-08T22:58:07.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:07.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803810 2026-03-08T22:58:07.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803810' 2026-03-08T22:58:07.830 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803810 2026-03-08T22:58:07.830 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:08.042 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803808 -lt 25769803810 2026-03-08T22:58:08.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 
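The `flush_pg_stats` bookkeeping traced here stores each OSD's flush result as an `osd-seq` pair in a whitespace-joined string, then splits the pairs back apart with `cut` before waiting on each sequence number. A self-contained sketch of just that parsing step (the seq values are sample data taken from the trace):

```shell
# Sketch of flush_pg_stats' pair handling: "osd-seq" strings are split
# with cut, exactly as in the trace above.
seqs=" 0-25769803810 1-352187318279"
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)
    seq=$(echo "$s" | cut -d - -f 2)
    echo "waiting osd.$osd seq $seq"
    # the real helper now polls `ceph osd last-stat-seq $osd` until it
    # reaches $seq, so freshly flushed PG stats are visible to the mon
done
```

Blocking on `last-stat-seq` is what makes the subsequent `wait_for_clean` trustworthy: the PG states it inspects are guaranteed to be at least as new as the flush.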
2026-03-08T22:58:09.044 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:58:09.044 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:09.252 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803810 -lt 25769803810 2026-03-08T22:58:09.252 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:09.252 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-352187318279 2026-03-08T22:58:09.252 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:09.253 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:58:09.253 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-352187318279 2026-03-08T22:58:09.254 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:09.254 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=352187318279 2026-03-08T22:58:09.254 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 352187318279' 2026-03-08T22:58:09.254 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 
seq 352187318279 2026-03-08T22:58:09.255 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:58:09.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318279 -lt 352187318279 2026-03-08T22:58:09.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:09.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-390842023939 2026-03-08T22:58:09.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:09.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:58:09.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-390842023939 2026-03-08T22:58:09.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:09.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=390842023939 2026-03-08T22:58:09.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 390842023939' 2026-03-08T22:58:09.468 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 390842023939 2026-03-08T22:58:09.469 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:58:09.686 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 390842023939 -lt 390842023939 2026-03-08T22:58:09.686 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:09.686 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117022 2026-03-08T22:58:09.687 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:09.688 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:58:09.688 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117022 2026-03-08T22:58:09.689 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:09.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117022 2026-03-08T22:58:09.690 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117022' 2026-03-08T22:58:09.690 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 115964117022 2026-03-08T22:58:09.690 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 
3 2026-03-08T22:58:09.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117022 -lt 115964117022 2026-03-08T22:58:09.905 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:58:09.905 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:09.905 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:10.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:58:10.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:58:10.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:58:10.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:58:10.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:58:10.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:58:10.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json 
pg dump pgs 2026-03-08T22:58:10.191 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:58:10.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:58:10.405 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:58:10.405 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:10.405 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:10.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:58:10.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:58:10.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:58:10.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:209: set_size: dd if=/dev/urandom bs=256 count=1 2026-03-08T22:58:10.696 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in 2026-03-08T22:58:10.696 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out 2026-03-08T22:58:10.696 INFO:tasks.workunit.client.0.vm04.stderr:256 bytes copied, 8.2434e-05 s, 3.1 MB/s 2026-03-08T22:58:10.697 
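The `get_num_active_clean` jq filter above counts PG states that contain both `active` and `clean` but not `stale`. The same selection can be expressed in plain shell, which makes the filter's semantics easy to check without jq; the function name and sample states below are illustrative:

```shell
# jq-free sketch of get_num_active_clean's selection logic: count states
# containing both "active" and "clean", excluding anything "stale".
count_active_clean() {
    local n=0 state
    for state in "$@"; do
        case "$state" in
            *stale*) ;;                                 # stale PGs never count
            *active*clean*|*clean*active*) n=$((n+1)) ;;
        esac
    done
    echo "$n"
}
count_active_clean active+clean active+clean+scrubbing stale+active+clean active+recovering
# → 2
```

In the trace this count (5) matches `get_num_pgs` (5), so `wait_for_clean` breaks out of its retry loop on the first pass.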
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:216: set_size: objectstore_tool td/test-erasure-eio 2 obj-size-102080-1-256 set-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 2 obj-size-102080-1-256 set-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.2
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:58:10.697 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:58:10.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:58:10.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:58:10.698 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:58:10.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:58:10.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 2 obj-size-102080-1-256 set-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:58:10.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio
2026-03-08T22:58:10.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T22:58:10.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2
2026-03-08T22:58:10.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T22:58:10.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/2
2026-03-08T22:58:10.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/2 obj-size-102080-1-256 set-bytes td/test-erasure-eio/CORRUPT
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 2
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/2
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true
'
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2'
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal'
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T22:58:11.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:58:11.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T22:58:11.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:58:11.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:58:11.932 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:58:11.932 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:11.932 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:58:11.932 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:58:11.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:58:11.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:58:11.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:58:11.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:58:11.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:58:11.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:58:11.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:58:11.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:58:11.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T22:58:11.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T22:58:11.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/2
2026-03-08T22:58:11.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2
2026-03-08T22:58:11.934 INFO:tasks.workunit.client.0.vm04.stderr:start osd.2
2026-03-08T22:58:11.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:58:11.934 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/2/whoami
2026-03-08T22:58:11.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']'
2026-03-08T22:58:11.935 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:58:11.936 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:58:11.938 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:58:11.956 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:11.957+0000 7f0e56a10780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:11.964 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:11.965+0000 7f0e56a10780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:11.965 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:11.966+0000 7f0e56a10780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:12.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2
2026-03-08T22:58:12.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:58:12.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T22:58:12.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:58:12.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:58:12.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:12.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:58:12.157 INFO:tasks.workunit.client.0.vm04.stderr:0
2026-03-08T22:58:12.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:12.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:58:12.376 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:58:12.774 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:12.776+0000 7f0e56a10780 -1 Falling back to public interface
2026-03-08T22:58:13.378 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:58:13.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:13.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:58:13.379 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:58:13.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:13.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:58:13.600 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:58:13.648 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:13.650+0000 7f0e56a10780 -1 osd.2 92 log_to_monitors true
2026-03-08T22:58:14.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:58:14.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:14.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:58:14.603 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:58:14.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:14.603 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:58:14.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:58:15.071 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:15.073+0000 7f0e4d2cc640 -1 osd.2 92 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T22:58:15.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:58:15.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:15.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:58:15.841 INFO:tasks.workunit.client.0.vm04.stderr:3
2026-03-08T22:58:15.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:15.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:58:16.062 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 up in weight 1 up_from 96 up_thru 79 down_at 93 last_clean_interval [91,92) [v2:127.0.0.1:6818/1654112762,v1:127.0.0.1:6819/1654112762] [v2:127.0.0.1:6820/1654112762,v1:127.0.0.1:6821/1654112762] exists,up ae882e4b-3141-437c-b90b-795070ff9b8d
2026-03-08T22:58:16.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:58:16.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:58:16.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:58:16.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:58:16.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:58:16.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:58:16.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:58:16.062 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:58:16.063 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:58:16.063 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:58:16.063 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:58:16.063 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:58:16.063 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:58:16.135 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:58:16.135 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:58:16.135 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:58:16.135 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:58:16.135 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:58:16.135 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:58:16.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:58:16.359 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T22:58:16.359 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T22:58:16.359 INFO:tasks.workunit.client.0.vm04.stderr:3'
2026-03-08T22:58:16.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:58:16.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:58:16.360 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:58:16.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803814
2026-03-08T22:58:16.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803814
2026-03-08T22:58:16.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803814'
2026-03-08T22:58:16.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:58:16.443 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:58:16.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=352187318282
2026-03-08T22:58:16.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 352187318282
2026-03-08T22:58:16.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803814 1-352187318282'
2026-03-08T22:58:16.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:58:16.520 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:58:16.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=412316860419
2026-03-08T22:58:16.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 412316860419
2026-03-08T22:58:16.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803814 1-352187318282 2-412316860419'
2026-03-08T22:58:16.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:58:16.596 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:58:16.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117025
2026-03-08T22:58:16.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117025
2026-03-08T22:58:16.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803814 1-352187318282 2-412316860419 3-115964117025'
2026-03-08T22:58:16.669 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:58:16.670 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803814
2026-03-08T22:58:16.670 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:58:16.671 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:58:16.671 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803814
2026-03-08T22:58:16.671 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:58:16.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803814
2026-03-08T22:58:16.672 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803814'
2026-03-08T22:58:16.672 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803814
2026-03-08T22:58:16.672 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:58:16.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803812 -lt 25769803814
2026-03-08T22:58:16.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:58:17.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:58:17.891 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:58:18.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803812 -lt 25769803814
2026-03-08T22:58:18.115 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:58:19.117 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T22:58:19.117 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:58:19.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803814 -lt 25769803814
2026-03-08T22:58:19.359 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:58:19.359 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-352187318282
2026-03-08T22:58:19.359 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:58:19.360 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:58:19.361 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-352187318282
2026-03-08T22:58:19.361 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:58:19.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=352187318282
2026-03-08T22:58:19.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 352187318282'
2026-03-08T22:58:19.362 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 352187318282
2026-03-08T22:58:19.363 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:58:19.581 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318283 -lt 352187318282
2026-03-08T22:58:19.582 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:58:19.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-412316860419
2026-03-08T22:58:19.582 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:58:19.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-412316860419
2026-03-08T22:58:19.584 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=412316860419
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 412316860419'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 412316860419
2026-03-08T22:58:19.586 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:58:19.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 412316860419 -lt 412316860419
2026-03-08T22:58:19.807 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:58:19.807 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117025
2026-03-08T22:58:19.807 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:58:19.808 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:58:19.809 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117025
2026-03-08T22:58:19.809 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:58:19.810 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117025
2026-03-08T22:58:19.810 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117025'
2026-03-08T22:58:19.810 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 115964117025
2026-03-08T22:58:19.811 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:58:20.030 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117026 -lt 115964117025
2026-03-08T22:58:20.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:58:20.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:58:20.031 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:58:20.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:58:20.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:58:20.326 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:58:20.326 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:58:20.326 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:58:20.326 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:58:20.326 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:58:20.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:58:20.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:58:20.553 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:58:20.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:58:20.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:58:20.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:58:20.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:58:20.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:58:20.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:217: set_size: rm -f td/test-erasure-eio/CORRUPT
2026-03-08T22:58:20.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:218: set_size: ceph osd unset noout
2026-03-08T22:58:21.164 INFO:tasks.workunit.client.0.vm04.stderr:noout is unset
2026-03-08T22:58:21.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:243: rados_get_data_bad_size: rados_get td/test-erasure-eio pool-jerasure obj-size-102080-1-256 fail
2026-03-08T22:58:21.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio
2026-03-08T22:58:21.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure
2026-03-08T22:58:21.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-size-102080-1-256
2026-03-08T22:58:21.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=fail
2026-03-08T22:58:21.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' fail = fail ']'
2026-03-08T22:58:21.175 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:114: rados_get: rados --pool pool-jerasure get obj-size-102080-1-256 td/test-erasure-eio/COPY
2026-03-08T22:58:21.201 INFO:tasks.workunit.client.0.vm04.stderr:error getting pool-jerasure/obj-size-102080-1-256: (5) Input/output error
2026-03-08T22:58:21.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:115: rados_get: return
2026-03-08T22:58:21.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:244: rados_get_data_bad_size: rm td/test-erasure-eio/ORIGINAL
2026-03-08T22:58:21.205
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:326: TEST_rados_get_bad_size_shard_1: delete_erasure_coded_pool pool-jerasure 2026-03-08T22:58:21.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:58:21.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it 2026-03-08T22:58:21.427 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T22:58:21.437 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile 2026-03-08T22:58:21.713 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T22:58:21.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T22:58:21.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:58:21.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:58:21.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:58:21.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 
2026-03-08T22:58:21.724 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:58:21.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:58:21.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:58:21.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:58:21.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:58:21.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:58:21.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:58:21.853 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:58:21.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:58:21.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:58:21.854 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:58:21.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:58:21.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:58:21.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:58:21.856 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:58:21.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:58:21.858 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:58:21.858 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:58:21.882 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:58:21.882 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:21.882 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:21.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:58:21.884 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:58:21.884 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:58:21.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T22:58:21.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T22:58:21.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T22:58:21.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio 2026-03-08T22:58:21.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:58:21.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:58:21.885 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:58:21.885 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:58:21.885 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:58:21.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:58:21.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:58:21.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:58:21.888 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:58:21.888 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:58:21.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:58:21.889 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:58:21.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:58:21.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:58:21.891 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:58:21.892 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:58:21.892 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:58:21.892 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:58:21.892 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:58:21.893 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:58:21.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:58:21.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:58:21.895 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:58:21.895 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:21.896 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:21.896 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:58:21.897 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:58:21.897 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:58:21.897 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T22:58:21.898 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:58:21.898 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:21.898 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:21.898 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080 2026-03-08T22:58:21.899 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:58:21.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:58:21.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:58:21.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T22:58:21.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T22:58:21.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T22:58:21.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:58:21.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:58:21.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:58:21.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T22:58:21.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T22:58:21.927 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:58:21.927 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:21.927 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:21.927 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:21.928 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:21.928 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:21.928 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:21.928 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:58:21.961 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:58:21.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 
2026-03-08T22:58:21.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:58:21.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:58:21.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:58:21.961 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:58:21.963 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:58:21.963 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:58:21.963 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:58:21.964 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:58:21.964 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:21.965 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:21.965 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:58:21.966 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:58:21.966 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T22:58:22.016 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:58:22.017 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:58:22.017 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:58:22.017 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:58:22.017 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:58:22.017 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:58:22.017 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:58:22.017 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:58:22.018 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:58:22.018 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:22.018 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:22.018 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:58:22.018 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:58:22.018 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T22:58:22.072 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T22:58:22.072 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T22:58:22.072 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:58:22.072 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:58:22.072 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:58:22.072 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T22:58:22.072 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:58:22.198 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:58:22.198 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:22.198 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:22.198 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:22.198 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:22.198 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:22.199 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:22.199 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:58:22.200 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 
--run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:58:22.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T22:58:22.221 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:58:22.339 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:58:22.349 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:58:23.350 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T22:58:23.350 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:58:23.350 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:58:23.351 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:58:23.351 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:23.351 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:23.351 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:58:23.351 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T22:58:23.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T22:58:23.395 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:58:23.401 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T22:58:23.402 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:58:21.952+0000 7f9f525ddd80 0 load: jerasure load: lrc 2026-03-08T22:58:23.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_rados_get_subread_eio_shard_0 td/test-erasure-eio 2026-03-08T22:58:23.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:255: TEST_rados_get_subread_eio_shard_0: local dir=td/test-erasure-eio 2026-03-08T22:58:23.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:256: TEST_rados_get_subread_eio_shard_0: setup_osds 4 2026-03-08T22:58:23.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=4 2026-03-08T22:58:23.402 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T22:58:23.403 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 4 - 1 2026-03-08T22:58:23.404 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 3 2026-03-08T22:58:23.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:58:23.404 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: 
run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 
2026-03-08T22:58:23.405 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:58:23.406 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:58:23.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T22:58:23.407 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:58:23.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=25b63eef-275a-429d-b495-8a2fed632d2b 2026-03-08T22:58:23.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 25b63eef-275a-429d-b495-8a2fed632d2b' 2026-03-08T22:58:23.408 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 25b63eef-275a-429d-b495-8a2fed632d2b 2026-03-08T22:58:23.408 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:58:23.421 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAP/61pTY02GRAA4lk1sD0PS80oqXBhnoQtjw== 2026-03-08T22:58:23.421 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAP/61pTY02GRAA4lk1sD0PS80oqXBhnoQtjw=="}' 2026-03-08T22:58:23.421 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 25b63eef-275a-429d-b495-8a2fed632d2b -i td/test-erasure-eio/0/new.json 2026-03-08T22:58:23.554 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:58:23.563 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T22:58:23.564 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAP/61pTY02GRAA4lk1sD0PS80oqXBhnoQtjw== --osd-uuid 25b63eef-275a-429d-b495-8a2fed632d2b 2026-03-08T22:58:23.586 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:23.586+0000 7fb557a0f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:23.586 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:23.589+0000 7fb557a0f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:23.591 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:23.593+0000 7fb557a0f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:23.591 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:23.593+0000 7fb557a0f780 -1 bdev(0x56506c5a4800 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:58:23.591 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:23.593+0000 7fb557a0f780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T22:58:25.880 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T22:58:25.880 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:58:25.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:58:25.881 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:58:25.881 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:58:26.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:58:26.190 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:58:26.190 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:58:26.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:58:26.191 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:58:26.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:58:26.209 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:26.211+0000 7fc3a9438780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:26.216 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:26.219+0000 7fc3a9438780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:26.219 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:26.220+0000 7fc3a9438780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:26.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:58:26.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:26.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:58:26.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:26.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:26.426 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:58:26.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:26.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:26.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:26.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:58:26.653 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:27.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:27.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i < 300 )) 2026-03-08T22:58:27.655 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:58:27.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:27.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:27.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:58:27.785 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:27.787+0000 7fc3a9438780 -1 Falling back to public interface 2026-03-08T22:58:27.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:28.656 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:28.659+0000 7fc3a9438780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:58:28.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:28.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:28.873 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:58:28.873 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:58:28.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:28.874 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:58:29.122 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:29.656 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:29.658+0000 7fc3a4bd9640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:58:30.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:30.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:30.123 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:58:30.123 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:58:30.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:30.124 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:58:30.344 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3578238739,v1:127.0.0.1:6803/3578238739] [v2:127.0.0.1:6804/3578238739,v1:127.0.0.1:6805/3578238739] exists,up 25b63eef-275a-429d-b495-8a2fed632d2b 2026-03-08T22:58:30.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:58:30.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:58:30.344 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:58:30.344 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:58:30.345 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:30.345 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:30.346 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:58:30.346 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:58:30.346 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:58:30.348 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:58:30.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=714d122e-7f3e-455b-ad5c-406829982311 2026-03-08T22:58:30.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 714d122e-7f3e-455b-ad5c-406829982311' 2026-03-08T22:58:30.349 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 714d122e-7f3e-455b-ad5c-406829982311 2026-03-08T22:58:30.349 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:58:30.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAW/61p5iCsFRAAuO+w8EmdkQTDTdZqXjjKqQ== 2026-03-08T22:58:30.362 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAW/61p5iCsFRAAuO+w8EmdkQTDTdZqXjjKqQ=="}' 2026-03-08T22:58:30.362 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 714d122e-7f3e-455b-ad5c-406829982311 -i td/test-erasure-eio/1/new.json 2026-03-08T22:58:30.584 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:58:30.594 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T22:58:30.595 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAW/61p5iCsFRAAuO+w8EmdkQTDTdZqXjjKqQ== --osd-uuid 714d122e-7f3e-455b-ad5c-406829982311 2026-03-08T22:58:30.615 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:30.616+0000 7f619e23f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:30.617 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:30.619+0000 7f619e23f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:30.618 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:30.620+0000 7f619e23f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:30.619 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:30.621+0000 7f619e23f780 -1 bdev(0x559c0655dc00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:58:30.619 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:30.621+0000 7f619e23f780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T22:58:32.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T22:58:32.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:58:32.752 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T22:58:32.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:58:32.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:58:33.076 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:58:33.076 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T22:58:33.076 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:58:33.076 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:58:33.077 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:58:33.079 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:58:33.094 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:33.096+0000 7f47fb21e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:33.102 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:33.104+0000 7f47fb21e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:33.103 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:33.105+0000 7f47fb21e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:33.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:58:33.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:33.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:58:33.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:33.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:33.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:33.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:33.301 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:58:33.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:33.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:58:33.526 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:34.188 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:34.190+0000 7f47fb21e780 -1 Falling back to public interface 2026-03-08T22:58:34.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:58:34.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:34.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:34.527 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:58:34.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:34.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:58:34.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:35.019 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:35.021+0000 7f47fb21e780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:58:35.760 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:58:35.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:35.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:35.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:58:35.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:35.761 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:58:36.003 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:37.005 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:58:37.005 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:37.005 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:37.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:58:37.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:37.006 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/203646256,v1:127.0.0.1:6811/203646256] [v2:127.0.0.1:6812/203646256,v1:127.0.0.1:6813/203646256] exists,up 714d122e-7f3e-455b-ad5c-406829982311 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count 
- 1)) 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:58:37.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:58:37.383 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:58:37.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:58:37.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:58:37.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:58:37.384 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:58:37.384 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:58:37.384 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:37.384 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:37.384 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:37.384 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:37.384 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:37.385 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:58:37.385 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:58:37.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:58:37.387 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:58:37.388 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=03dbfcd7-59ec-4f74-ac15-2d80661dfd04 2026-03-08T22:58:37.388 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 03dbfcd7-59ec-4f74-ac15-2d80661dfd04' 2026-03-08T22:58:37.388 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 03dbfcd7-59ec-4f74-ac15-2d80661dfd04 2026-03-08T22:58:37.388 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:58:37.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAd/61pP8wbGBAAvIypS79nMV4Vs0pxUoZmGQ== 2026-03-08T22:58:37.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAd/61pP8wbGBAAvIypS79nMV4Vs0pxUoZmGQ=="}' 2026-03-08T22:58:37.403 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 03dbfcd7-59ec-4f74-ac15-2d80661dfd04 -i td/test-erasure-eio/2/new.json 2026-03-08T22:58:37.718 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:58:37.729 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T22:58:37.731 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAd/61pP8wbGBAAvIypS79nMV4Vs0pxUoZmGQ== --osd-uuid 03dbfcd7-59ec-4f74-ac15-2d80661dfd04 2026-03-08T22:58:37.749 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:37.751+0000 7f646b52e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:37.769 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:37.772+0000 7f646b52e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:37.780 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:37.773+0000 7f646b52e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:37.780 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:37.773+0000 7f646b52e780 -1 bdev(0x55cc35eb9c00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:58:37.780 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:37.773+0000 7f646b52e780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T22:58:40.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T22:58:40.655 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:58:40.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:58:40.656 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:58:40.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:58:40.946 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:58:40.947 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:58:40.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:58:40.947 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:58:40.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:58:40.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:58:40.964 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:40.966+0000 7fda95790780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:40.967 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:40.969+0000 7fda95790780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:40.968 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:40.970+0000 7fda95790780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:41.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:58:41.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:41.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:58:41.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:41.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:41.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:41.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:41.163 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:58:41.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:41.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:41.373 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:42.037 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:42.039+0000 7fda95790780 -1 Falling back to public interface 2026-03-08T22:58:42.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:58:42.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:42.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:42.374 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:58:42.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:42.375 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:42.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:42.882 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:42.885+0000 7fda95790780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:58:43.601 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:58:43.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:43.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:43.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:58:43.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:43.601 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:43.827 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:44.829 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:58:44.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:44.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:44.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:58:44.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:44.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:45.037 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 19 up_thru 19 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/3255743805,v1:127.0.0.1:6819/3255743805] [v2:127.0.0.1:6820/3255743805,v1:127.0.0.1:6821/3255743805] exists,up 03dbfcd7-59ec-4f74-ac15-2d80661dfd04 2026-03-08T22:58:45.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:58:45.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:58:45.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr 
$count - 1)) 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:58:45.038 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:45.038 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:45.039 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:58:45.039 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:58:45.039 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:58:45.040 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:58:45.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=950b5c7a-3dd9-43e3-8491-b91632b7e6a7 2026-03-08T22:58:45.041 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 950b5c7a-3dd9-43e3-8491-b91632b7e6a7' 2026-03-08T22:58:45.041 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 950b5c7a-3dd9-43e3-8491-b91632b7e6a7 2026-03-08T22:58:45.041 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:58:45.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAl/61puZhTAxAA/aHTMKjxDHOya/a1131svQ== 2026-03-08T22:58:45.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAl/61puZhTAxAA/aHTMKjxDHOya/a1131svQ=="}' 2026-03-08T22:58:45.054 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 950b5c7a-3dd9-43e3-8491-b91632b7e6a7 -i td/test-erasure-eio/3/new.json 2026-03-08T22:58:45.273 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:58:45.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json 2026-03-08T22:58:45.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAl/61puZhTAxAA/aHTMKjxDHOya/a1131svQ== --osd-uuid 950b5c7a-3dd9-43e3-8491-b91632b7e6a7 2026-03-08T22:58:45.303 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:45.305+0000 7fdf5a590780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:45.305 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:45.307+0000 7fdf5a590780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:45.306 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:45.308+0000 7fdf5a590780 -1 WARNING: all dangerous and experimental features are enabled. 
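The new-OSD bootstrap traced above (ceph-helpers.sh:662-666) wraps a fresh uuid and a generated cephx key in a small JSON document and hands it to `ceph osd new`, which prints the allocated id (3 in this run). A minimal sketch of the JSON construction, with the secret as a parameter — the real helper gets it from `ceph-authtool --gen-print-key` and the uuid from `uuidgen`:

```shell
# Build the payload run_osd writes to td/.../new.json (sketch only; the
# secret/uuid generation and the `ceph osd new` call need real tooling).
make_new_osd_json() {
    local secret=$1
    printf '{"cephx_secret": "%s"}\n' "$secret"
}
```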
2026-03-08T22:58:45.307 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:45.309+0000 7fdf5a590780 -1 bdev(0x55c35ddb3c00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:58:45.307 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:45.309+0000 7fdf5a590780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid 2026-03-08T22:58:47.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring 2026-03-08T22:58:47.438 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:58:47.439 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository 2026-03-08T22:58:47.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:58:47.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:58:47.718 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3 2026-03-08T22:58:47.718 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:58:47.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:58:47.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:58:47.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:58:47.722 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:58:47.737 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:47.738+0000 7f11aa50e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:47.738 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:47.741+0000 7f11aa50e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:47.740 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:47.742+0000 7f11aa50e780 -1 WARNING: all dangerous and experimental features are enabled. 
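The `grep -q '"noup"'` pipeline at ceph-helpers.sh:681 is a gate: run_osd only proceeds to wait for the OSD to be marked up when the cluster's "noup" flag is clear. The real helper pipes `ceph osd dump --format=json` through `jq '.flags_set[]'`; the same check against a canned dump fragment (no jq or live cluster needed for the sketch):

```shell
# Return success iff the "noup" flag appears in the osd dump JSON.
noup_is_set() {
    printf '%s\n' "$1" | grep -q '"noup"'
}
```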
2026-03-08T22:58:47.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:58:47.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:47.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:58:47.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:47.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:47.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:47.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:47.940 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:58:47.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:47.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:48.142 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:48.310 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:48.312+0000 7f11aa50e780 -1 Falling back to public interface 2026-03-08T22:58:49.145 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:58:49.145 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:49.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:49.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:49.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:49.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:49.184 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:49.186+0000 7f11aa50e780 -1 osd.3 0 log_to_monitors true 2026-03-08T22:58:49.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:50.203 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:58:50.206+0000 7f11a5cad640 -1 osd.3 0 waiting for initial osdmap 2026-03-08T22:58:50.379 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:58:50.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:50.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:50.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:58:50.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:50.379 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:50.590 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 28 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/792242120,v1:127.0.0.1:6827/792242120] [v2:127.0.0.1:6828/792242120,v1:127.0.0.1:6829/792242120] exists,up 950b5c7a-3dd9-43e3-8491-b91632b7e6a7 2026-03-08T22:58:50.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:58:50.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:58:50.591 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:58:50.591 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 2026-03-08T22:58:50.591 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:58:50.591 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:58:50.591 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:58:50.591 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:50.591 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:50.591 
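The wait_for_osd iterations traced above (ceph-helpers.sh:978-991) are up to 300 one-second probes of `ceph osd dump` for the "osd.N up" line; in this run the match lands on the third probe. The loop's shape, with `osd_dump` as a hypothetical stand-in for the real `ceph osd dump` call:

```shell
# Poll until "osd.$id $state" shows up in the dump, or give up after 300s.
wait_for_osd_state() {
    local state=$1 id=$2 status=1 i=0
    while [ "$i" -lt 300 ]; do
        if osd_dump | grep -q "osd\.$id $state"; then
            status=0
            break
        fi
        sleep 1
        i=$((i + 1))
    done
    return $status
}
```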
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:58:50.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T22:58:50.592 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T22:58:50.638 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:58:50.643 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T22:58:50.645 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:58:27.791+0000 7fc3a9438780 0 load: jerasure load: lrc 2026-03-08T22:58:50.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:258: TEST_rados_get_subread_eio_shard_0: local poolname=pool-jerasure 2026-03-08T22:58:50.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:259: TEST_rados_get_subread_eio_shard_0: create_erasure_coded_pool pool-jerasure 2 1 2026-03-08T22:58:50.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:58:50.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T22:58:50.646 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=2 2026-03-08T22:58:50.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T22:58:50.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=1 2026-03-08T22:58:50.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T22:58:50.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=2 m=1 crush-failure-domain=osd 2026-03-08T22:58:50.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T22:58:50.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T22:58:51.287 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T22:58:51.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:58:52.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T22:58:52.298 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:58:52.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:58:52.298 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:58:52.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:58:52.298 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:58:52.298 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:58:52.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:58:52.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:58:52.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:58:52.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:58:52.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:58:52.366 
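The delays array `get_timeout_delays 90 .1` produces above is consistent with a doubling backoff: each delay is twice the last, capped at 15 seconds, with the final entry trimmed so the series sums exactly to the 90-second budget (0.1 + 0.2 + … + 12.8 + 4×15 + 4.5 = 90). A sketch of that arithmetic in awk (an assumption about the helper's internals, inferred from the trace, not a copy of it):

```shell
# Emit one delay per line: double from $2, cap at 15, trim the last step
# so the total equals $1.
gen_delays() {
    awk -v total="$1" -v d="$2" 'BEGIN {
        spent = 0
        while (spent < total - 1e-9) {          # epsilon guards float drift
            if (d > 15) d = 15                  # cap each step at 15s
            if (spent + d > total) d = total - spent   # trim final step
            printf "%g\n", d
            spent += d
            d *= 2
        }
    }'
}
```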
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:58:52.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:58:52.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:58:52.366 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:58:52.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:58:52.570 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:58:52.570 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:58:52.570 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:58:52.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:58:52.570 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:52.570 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:58:52.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803783 2026-03-08T22:58:52.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803783 2026-03-08T22:58:52.641 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783' 2026-03-08T22:58:52.641 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:52.641 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:58:52.711 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574854 2026-03-08T22:58:52.711 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574854 2026-03-08T22:58:52.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854' 2026-03-08T22:58:52.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:52.712 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:58:52.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378628 2026-03-08T22:58:52.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378628 2026-03-08T22:58:52.780 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854 2-81604378628' 2026-03-08T22:58:52.781 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:52.781 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:58:52.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116995 2026-03-08T22:58:52.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116995 2026-03-08T22:58:52.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854 2-81604378628 3-115964116995' 2026-03-08T22:58:52.850 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:52.850 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803783 2026-03-08T22:58:52.850 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:52.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:58:52.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803783 2026-03-08T22:58:52.851 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:52.852 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803783 2026-03-08T22:58:52.852 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803783 2026-03-08T22:58:52.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803783' 2026-03-08T22:58:52.852 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:53.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803781 -lt 25769803783 2026-03-08T22:58:53.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:58:54.056 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:58:54.056 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:54.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803781 -lt 25769803783 2026-03-08T22:58:54.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:58:55.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:58:55.288 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T22:58:55.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803783 -lt 25769803783 2026-03-08T22:58:55.499 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:55.499 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574854 2026-03-08T22:58:55.499 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:55.500 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:58:55.500 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574854 2026-03-08T22:58:55.501 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:55.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574854 2026-03-08T22:58:55.501 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574854' 2026-03-08T22:58:55.501 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574854 2026-03-08T22:58:55.502 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:58:55.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 55834574854 -lt 55834574854 2026-03-08T22:58:55.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:55.708 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378628 2026-03-08T22:58:55.708 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:55.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:58:55.709 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378628 2026-03-08T22:58:55.709 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:55.710 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378628 2026-03-08T22:58:55.710 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378628 2026-03-08T22:58:55.710 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378628' 2026-03-08T22:58:55.710 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:58:55.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378629 -lt 81604378628 2026-03-08T22:58:55.924 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:55.925 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116995 2026-03-08T22:58:55.925 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:55.926 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:58:55.926 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116995 2026-03-08T22:58:55.927 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:55.927 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116995 2026-03-08T22:58:55.928 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116995 2026-03-08T22:58:55.928 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116995' 2026-03-08T22:58:55.928 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:58:56.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116995 -lt 115964116995 2026-03-08T22:58:56.134 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 
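The flush_pg_stats sequence above (ceph-helpers.sh:2260-2279) has two phases: `ceph tell osd.N flush_pg_stats` returns the stat sequence number the flush produced, and the helper then polls `ceph osd last-stat-seq N` once a second until the monitor has caught up to that sequence — which is why osd.0 needs two `sleep 1` rounds (25769803781 < 25769803783) before passing. The core loop, with `tell_flush` and `last_seq` as hypothetical stand-ins for those two ceph calls:

```shell
# Block until the monitor's last-stat-seq for $1 reaches the flushed seq.
wait_stats_flushed() {
    local osd=$1 timeout=${2:-300} seq
    seq=$(tell_flush "$osd")
    while [ "$(last_seq "$osd")" -lt "$seq" ]; do
        sleep 1
        timeout=$((timeout - 1))
        [ "$timeout" -eq 0 ] && return 1
    done
    return 0
}
```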
2026-03-08T22:58:56.134 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:56.135 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:56.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:58:56.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:58:56.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:58:56.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:58:56.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:58:56.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:58:56.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:58:56.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:58:56.621 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:58:56.621 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:58:56.621 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:56.621 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:56.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:58:56.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:58:56.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:58:56.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:261: TEST_rados_get_subread_eio_shard_0: local shard_id=0 2026-03-08T22:58:56.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:262: TEST_rados_get_subread_eio_shard_0: rados_put_get_data eio td/test-erasure-eio 0 2026-03-08T22:58:56.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:147: rados_put_get_data: local inject=eio 2026-03-08T22:58:56.909 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:148: rados_put_get_data: shift 2026-03-08T22:58:56.910 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:149: rados_put_get_data: local dir=td/test-erasure-eio 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:150: rados_put_get_data: shift 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:151: rados_put_get_data: local shard_id=0 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:152: rados_put_get_data: shift 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:153: rados_put_get_data: local arg= 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:157: rados_put_get_data: local poolname=pool-jerasure 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:158: rados_put_get_data: local objname=obj-eio-102080-0 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:159: rados_put_get_data: rados_put td/test-erasure-eio pool-jerasure obj-eio-102080-0 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:58:56.910 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=obj-eio-102080-0 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:58:56.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put obj-eio-102080-0 td/test-erasure-eio/ORIGINAL 2026-03-08T22:58:56.977 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:160: rados_put_get_data: inject_eio ec data pool-jerasure obj-eio-102080-0 td/test-erasure-eio 0 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=obj-eio-102080-0 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T22:58:56.977 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=0 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure obj-eio-102080-0 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-eio-102080-0 2026-03-08T22:58:56.977 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-eio-102080-0 2026-03-08T22:58:56.978 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:58:57.196 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:58:57.196 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:58:57.196 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:58:57.196 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:58:57.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '1' '2') 2026-03-08T22:58:57.196 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T22:58:57.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=3 2026-03-08T22:58:57.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']' 2026-03-08T22:58:57.196 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/3/type 2026-03-08T22:58:57.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T22:58:57.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 3 bluestore_debug_inject_read_err true 2026-03-08T22:58:57.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:58:57.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=3 2026-03-08T22:58:57.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T22:58:57.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:58:57.198 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.3 2026-03-08T22:58:57.198 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3 2026-03-08T22:58:57.198 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']' 2026-03-08T22:58:57.198 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:58:57.198 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:58:57.198 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:57.198 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:57.198 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok 2026-03-08T22:58:57.198 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.3.asok config set bluestore_debug_inject_read_err true 2026-03-08T22:58:57.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T22:58:57.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T22:58:57.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 
2026-03-08T22:58:57.249 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.3 2026-03-08T22:58:57.249 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3 2026-03-08T22:58:57.249 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']' 2026-03-08T22:58:57.249 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:58:57.249 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:57.249 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:57.250 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok 2026-03-08T22:58:57.250 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T22:58:57.250 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.3.asok injectdataerr pool-jerasure obj-eio-102080-0 0 2026-03-08T22:58:57.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:161: rados_put_get_data: rados_get td/test-erasure-eio pool-jerasure obj-eio-102080-0 2026-03-08T22:58:57.301 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:58:57.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:58:57.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-eio-102080-0 2026-03-08T22:58:57.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:58:57.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:58:57.302 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get obj-eio-102080-0 td/test-erasure-eio/COPY 2026-03-08T22:58:57.323 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:58:57.324 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:58:57.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:163: rados_put_get_data: '[' '' = recovery ']' 2026-03-08T22:58:57.325 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:181: rados_put_get_data: expr 0 + 1 2026-03-08T22:58:57.326 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:181: rados_put_get_data: shard_id=1 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:182: rados_put_get_data: inject_eio ec data pool-jerasure obj-eio-102080-0 td/test-erasure-eio 1 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=obj-eio-102080-0 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio 2026-03-08T22:58:57.326 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=1 2026-03-08T22:58:57.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T22:58:57.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure obj-eio-102080-0 2026-03-08T22:58:57.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:58:57.327 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-eio-102080-0 2026-03-08T22:58:57.327 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-eio-102080-0 2026-03-08T22:58:57.327 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:58:57.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:58:57.554 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:58:57.554 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:58:57.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:58:57.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: 
initial_osds=('3' '1' '2') 2026-03-08T22:58:57.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T22:58:57.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=1 2026-03-08T22:58:57.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']' 2026-03-08T22:58:57.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/1/type 2026-03-08T22:58:57.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T22:58:57.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 1 bluestore_debug_inject_read_err true 2026-03-08T22:58:57.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:58:57.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=1 2026-03-08T22:58:57.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T22:58:57.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:58:57.556 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:58:57.556 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.1 2026-03-08T22:58:57.557 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:58:57.557 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:58:57.557 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:58:57.557 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:57.557 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:57.557 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.1.asok 2026-03-08T22:58:57.557 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.1.asok config set bluestore_debug_inject_read_err true 2026-03-08T22:58:57.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T22:58:57.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T22:58:57.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 
2026-03-08T22:58:57.611 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.1 2026-03-08T22:58:57.611 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:58:57.611 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:58:57.611 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:58:57.611 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:57.611 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:58:57.611 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.1.asok 2026-03-08T22:58:57.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T22:58:57.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.1.asok injectdataerr pool-jerasure obj-eio-102080-0 1 2026-03-08T22:58:57.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:183: rados_put_get_data: rados_get td/test-erasure-eio pool-jerasure obj-eio-102080-0 fail 2026-03-08T22:58:57.661 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:58:57.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:58:57.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-eio-102080-0 2026-03-08T22:58:57.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=fail 2026-03-08T22:58:57.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' fail = fail ']' 2026-03-08T22:58:57.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:114: rados_get: rados --pool pool-jerasure get obj-eio-102080-0 td/test-erasure-eio/COPY 2026-03-08T22:58:57.721 INFO:tasks.workunit.client.0.vm04.stderr:error getting pool-jerasure/obj-eio-102080-0: (5) Input/output error 2026-03-08T22:58:57.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:115: rados_get: return 2026-03-08T22:58:57.723 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:184: rados_put_get_data: rm td/test-erasure-eio/ORIGINAL 2026-03-08T22:58:57.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:263: TEST_rados_get_subread_eio_shard_0: delete_erasure_coded_pool pool-jerasure 2026-03-08T22:58:57.724 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure
2026-03-08T22:58:57.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it
2026-03-08T22:58:57.968 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist
2026-03-08T22:58:57.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile
2026-03-08T22:58:58.281 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist
2026-03-08T22:58:58.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio
2026-03-08T22:58:58.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio
2026-03-08T22:58:58.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:58:58.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL
2026-03-08T22:58:58.290 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:58:58.290 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:58:58.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:58:58.290 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:58:58.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:58:58.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:58:58.405 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:58:58.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:58:58.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:58:58.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:58:58.408 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:58:58.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:58:58.410 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:58:58.410 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio
2026-03-08T22:58:58.426 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:58:58.426 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:58.426 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:58:58.426 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080
2026-03-08T22:58:58.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:58:58.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:58:58.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs
2026-03-08T22:58:58.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio
2026-03-08T22:58:58.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio
2026-03-08T22:58:58.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio
2026-03-08T22:58:58.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio
2026-03-08T22:58:58.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:58:58.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL
2026-03-08T22:58:58.428 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:58:58.428 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:58:58.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:58:58.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:58:58.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:58:58.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:58:58.430 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:58:58.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:58:58.431 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:58:58.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:58:58.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:58:58.432 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:58:58.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:58:58.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:58:58.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:58:58.433 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:58:58.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:58:58.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:58:58.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio
2026-03-08T22:58:58.435 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:58:58.435 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:58.435 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:58:58.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080
2026-03-08T22:58:58.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:58:58.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:58:58.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio
2026-03-08T22:58:58.437 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:58:58.437 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:58.437 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:58:58.437 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080
2026-03-08T22:58:58.438 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:58:58.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:58:58.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:58:58.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT
2026-03-08T22:58:58.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a
2026-03-08T22:58:58.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio
2026-03-08T22:58:58.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:58:58.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:58:58.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:58:58.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a
2026-03-08T22:58:58.439 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio
2026-03-08T22:58:58.463 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:58:58.463 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:58:58.463 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:58:58.463 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:58:58.463 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:58.463 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:58:58.463 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:58:58.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:58:58.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:58:58.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:58:58.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:58:58.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:58:58.493 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:58:58.494 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:58:58.494 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:58:58.494 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:58:58.494 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:58:58.494 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:58:58.495 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:58.495 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:58:58.495 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok
2026-03-08T22:58:58.495 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:58:58.495 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid
2026-03-08T22:58:58.541 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:58:58.541 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:58:58.541 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:58:58.541 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:58:58.542 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:58:58.542 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:58:58.542 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:58:58.542 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:58:58.542 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:58:58.542 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:58.542 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:58:58.542 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok
2026-03-08T22:58:58.543 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:58:58.543 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host
2026-03-08T22:58:58.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x
2026-03-08T22:58:58.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio
2026-03-08T22:58:58.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:58:58.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:58:58.591 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:58:58.591 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x
2026-03-08T22:58:58.591 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:58:58.702 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:58:58.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:58:58.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:58:58.703 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:58:58.703 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:58.703 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:58:58.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:58:58.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:58:58.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:58:58.724 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4
2026-03-08T22:58:58.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4
2026-03-08T22:58:58.836 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists
2026-03-08T22:58:58.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T22:58:59.849 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a
2026-03-08T22:58:59.849 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:58:59.849 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:58:59.849 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:58:59.849 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:59.849 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:58:59.849 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok
2026-03-08T22:58:59.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS=
2026-03-08T22:58:59.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush
2026-03-08T22:58:59.893 INFO:tasks.workunit.client.0.vm04.stdout:{}
2026-03-08T22:58:59.899 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log
2026-03-08T22:58:59.900 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:58:58.486+0000 7f2065532d80 0 load: jerasure load: lrc
2026-03-08T22:58:59.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_rados_get_subread_eio_shard_1 td/test-erasure-eio
2026-03-08T22:58:59.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:267: TEST_rados_get_subread_eio_shard_1: local dir=td/test-erasure-eio
2026-03-08T22:58:59.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:268: TEST_rados_get_subread_eio_shard_1: setup_osds 4
2026-03-08T22:58:59.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=4
2026-03-08T22:58:59.900 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift
2026-03-08T22:58:59.901 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 4 - 1
2026-03-08T22:58:59.901 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 3
2026-03-08T22:58:59.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:58:59.902 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0'
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal'
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:58:59.903 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:58:59.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0
2026-03-08T22:58:59.905 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:58:59.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0aaadddd-8314-4198-b3e1-1a23f5b583c1
2026-03-08T22:58:59.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 0aaadddd-8314-4198-b3e1-1a23f5b583c1'
2026-03-08T22:58:59.906 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 0aaadddd-8314-4198-b3e1-1a23f5b583c1
2026-03-08T22:58:59.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:58:59.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAz/61ptCDfNhAAPPnwxhDMEBVLk7khshR08A==
2026-03-08T22:58:59.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAz/61ptCDfNhAAPPnwxhDMEBVLk7khshR08A=="}'
2026-03-08T22:58:59.919 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0aaadddd-8314-4198-b3e1-1a23f5b583c1 -i td/test-erasure-eio/0/new.json
2026-03-08T22:59:00.033 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:59:00.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json
2026-03-08T22:59:00.046 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAz/61ptCDfNhAAPPnwxhDMEBVLk7khshR08A== --osd-uuid 0aaadddd-8314-4198-b3e1-1a23f5b583c1
2026-03-08T22:59:00.068 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:00.068+0000 7f4f3b65f780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:00.068 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:00.071+0000 7f4f3b65f780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:00.070 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:00.072+0000 7f4f3b65f780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:00.071 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:00.073+0000 7f4f3b65f780 -1 bdev(0x5614cb6c0800 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:59:00.071 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:00.073+0000 7f4f3b65f780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid
2026-03-08T22:59:02.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring
2026-03-08T22:59:02.721 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:59:02.722 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository
2026-03-08T22:59:02.722 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:59:02.722 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:59:03.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:59:03.012 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0
2026-03-08T22:59:03.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:59:03.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:59:03.013 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:59:03.015 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:59:03.030 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:03.031+0000 7f6a4c1ad780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:03.031 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:03.033+0000 7f6a4c1ad780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:03.032 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:03.034+0000 7f6a4c1ad780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:59:03.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:59:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:59:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:59:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:59:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:59:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:59:03.231 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:59:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:03.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:59:03.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:04.104 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:04.106+0000 7f6a4c1ad780 -1 Falling back to public interface 2026-03-08T22:59:04.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:59:04.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:04.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:59:04.461 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:59:04.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:04.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:59:04.681 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:04.945 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:04.947+0000 7f6a4c1ad780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:59:05.684 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:59:05.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:05.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:05.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:59:05.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:05.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:59:05.920 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:06.041 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:06.043+0000 7f6a4794e640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:59:06.922 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:59:06.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:06.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:06.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:59:06.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:59:06.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:07.145 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3895008083,v1:127.0.0.1:6803/3895008083] [v2:127.0.0.1:6804/3895008083,v1:127.0.0.1:6805/3895008083] exists,up 0aaadddd-8314-4198-b3e1-1a23f5b583c1 2026-03-08T22:59:07.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:59:07.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:59:07.145 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:59:07.146 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:07.146 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:59:07.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:59:07.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:59:07.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:07.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:07.147 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:07.147 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:07.147 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:07.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:59:07.148 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:59:07.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T22:59:07.149 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:59:07.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=015c383a-c91a-4d84-b4a9-e17fb582479a 2026-03-08T22:59:07.150 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 015c383a-c91a-4d84-b4a9-e17fb582479a' 2026-03-08T22:59:07.150 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 015c383a-c91a-4d84-b4a9-e17fb582479a 2026-03-08T22:59:07.151 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:59:07.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQA7/61pbPXbCRAATKSyE+dCjMBIzVo8hDUQTA== 2026-03-08T22:59:07.164 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQA7/61pbPXbCRAATKSyE+dCjMBIzVo8hDUQTA=="}' 2026-03-08T22:59:07.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 015c383a-c91a-4d84-b4a9-e17fb582479a -i td/test-erasure-eio/1/new.json 2026-03-08T22:59:07.375 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:59:07.385 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T22:59:07.386 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQA7/61pbPXbCRAATKSyE+dCjMBIzVo8hDUQTA== --osd-uuid 015c383a-c91a-4d84-b4a9-e17fb582479a 2026-03-08T22:59:07.405 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:07.407+0000 7f904f13f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:07.407 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:07.409+0000 7f904f13f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:59:07.408 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:07.410+0000 7f904f13f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:07.408 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:07.411+0000 7f904f13f780 -1 bdev(0x5647df429c00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:59:07.408 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:07.411+0000 7f904f13f780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T22:59:09.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T22:59:09.551 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:59:09.552 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T22:59:09.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:59:09.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:59:09.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:59:09.847 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T22:59:09.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:59:09.847 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:59:09.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:59:09.849 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:59:09.865 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:09.867+0000 7f3e7960d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:09.871 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:09.873+0000 7f3e7960d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:09.872 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:09.875+0000 7f3e7960d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:59:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:59:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:59:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:59:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:59:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:59:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:59:10.070 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:59:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:10.070 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:10.289 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:10.951 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:10.953+0000 7f3e7960d780 -1 Falling back to public interface 2026-03-08T22:59:11.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:59:11.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:11.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:59:11.291 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:59:11.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:11.291 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:11.511 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:11.805 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:11.807+0000 7f3e7960d780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:59:12.513 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:59:12.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:12.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:12.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:59:12.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:12.514 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:12.764 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:12.960 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:12.962+0000 7f3e75028640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T22:59:13.767 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:59:13.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:13.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:13.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:59:13.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:13.767 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:13.993 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1174203385,v1:127.0.0.1:6811/1174203385] [v2:127.0.0.1:6812/1174203385,v1:127.0.0.1:6813/1174203385] exists,up 015c383a-c91a-4d84-b4a9-e17fb582479a 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:59:13.994 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:13.994 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:13.994 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:13.994 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:59:13.995 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:59:13.995 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T22:59:13.996 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:59:13.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=8a70d3e4-1e23-4f41-8f05-729b8f047a94 2026-03-08T22:59:13.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 8a70d3e4-1e23-4f41-8f05-729b8f047a94' 2026-03-08T22:59:13.997 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 8a70d3e4-1e23-4f41-8f05-729b8f047a94 2026-03-08T22:59:13.997 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:59:14.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBC/61pple6ABAAu0nxrEj583t2MWM7Q26YUg== 2026-03-08T22:59:14.011 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBC/61pple6ABAAu0nxrEj583t2MWM7Q26YUg=="}' 2026-03-08T22:59:14.011 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 8a70d3e4-1e23-4f41-8f05-729b8f047a94 -i td/test-erasure-eio/2/new.json 2026-03-08T22:59:14.236 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:59:14.245 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T22:59:14.246 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBC/61pple6ABAAu0nxrEj583t2MWM7Q26YUg== --osd-uuid 8a70d3e4-1e23-4f41-8f05-729b8f047a94 2026-03-08T22:59:14.266 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:14.268+0000 7fe0eb153780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:14.268 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:14.270+0000 7fe0eb153780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:59:14.269 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:14.271+0000 7fe0eb153780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:14.269 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:14.271+0000 7fe0eb153780 -1 bdev(0x558be85dbc00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:59:14.269 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:14.271+0000 7fe0eb153780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T22:59:16.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T22:59:16.397 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:59:16.398 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T22:59:16.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:59:16.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:59:16.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:59:16.712 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T22:59:16.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:59:16.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:59:16.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:59:16.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:59:16.731 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:16.732+0000 7f57655ad780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:16.740 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:16.742+0000 7f57655ad780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:16.742 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:16.743+0000 7f57655ad780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:59:16.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:59:16.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:59:16.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:59:16.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:59:16.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:59:16.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:16.944 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:59:16.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:59:16.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:16.944 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:17.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:17.569 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:17.571+0000 7f57655ad780 -1 Falling back to public interface 2026-03-08T22:59:18.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:59:18.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:18.169 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:59:18.169 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:59:18.169 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:18.170 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:18.403 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:18.667 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:18.669+0000 7f57655ad780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:59:19.405 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:59:19.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:19.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:19.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:59:19.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:19.405 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:19.632 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:20.633 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:20.633 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:20.633 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:59:20.633 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:59:20.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:20.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:20.861 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 19 up_thru 19 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/175300900,v1:127.0.0.1:6819/175300900] [v2:127.0.0.1:6820/175300900,v1:127.0.0.1:6821/175300900] exists,up 8a70d3e4-1e23-4f41-8f05-729b8f047a94 2026-03-08T22:59:20.861 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count 
- 1)) 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:20.862 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:20.862 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:20.862 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:20.863 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:59:20.863 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:59:20.864 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:59:20.865 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=400b48a9-9563-4519-b9ba-48d671b3447e 2026-03-08T22:59:20.865 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 400b48a9-9563-4519-b9ba-48d671b3447e' 2026-03-08T22:59:20.865 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 400b48a9-9563-4519-b9ba-48d671b3447e 2026-03-08T22:59:20.865 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:59:20.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBI/61pJdF6NBAAVJFDHuWRsC+uGwMDHfN2Hg== 2026-03-08T22:59:20.879 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBI/61pJdF6NBAAVJFDHuWRsC+uGwMDHfN2Hg=="}' 2026-03-08T22:59:20.880 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 400b48a9-9563-4519-b9ba-48d671b3447e -i td/test-erasure-eio/3/new.json 2026-03-08T22:59:21.119 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:59:21.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json 2026-03-08T22:59:21.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBI/61pJdF6NBAAVJFDHuWRsC+uGwMDHfN2Hg== --osd-uuid 400b48a9-9563-4519-b9ba-48d671b3447e 2026-03-08T22:59:21.150 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:21.152+0000 7f75e04b2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:21.152 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:21.154+0000 7f75e04b2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:21.153 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:21.155+0000 7f75e04b2780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:59:21.154 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:21.156+0000 7f75e04b2780 -1 bdev(0x558e0a559c00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:59:21.154 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:21.156+0000 7f75e04b2780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid 2026-03-08T22:59:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring 2026-03-08T22:59:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:59:23.791 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository 2026-03-08T22:59:23.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:59:23.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:59:24.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:59:24.092 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3 2026-03-08T22:59:24.092 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:59:24.093 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:59:24.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:59:24.095 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:59:24.112 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:24.113+0000 7ff515010780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:24.113 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:24.115+0000 7ff515010780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:24.115 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:24.116+0000 7ff515010780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:59:24.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:59:24.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:59:24.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:59:24.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:59:24.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:59:24.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:24.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:59:24.316 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:59:24.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:24.316 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:24.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:25.182 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:25.184+0000 7ff515010780 -1 Falling back to public interface 2026-03-08T22:59:25.543 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:59:25.543 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:25.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:25.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:59:25.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:25.543 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:25.755 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:26.281 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:26.283+0000 7ff515010780 -1 osd.3 0 log_to_monitors true 2026-03-08T22:59:26.758 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:59:26.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:26.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:26.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:59:26.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:26.758 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:27.015 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:27.352 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:27.354+0000 7ff5100d5640 -1 osd.3 0 waiting for initial osdmap 2026-03-08T22:59:28.017 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:59:28.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:28.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:28.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:59:28.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:28.017 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:28.237 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 28 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/2688090542,v1:127.0.0.1:6827/2688090542] [v2:127.0.0.1:6828/2688090542,v1:127.0.0.1:6829/2688090542] exists,up 400b48a9-9563-4519-b9ba-48d671b3447e 2026-03-08T22:59:28.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:59:28.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:59:28.237 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:59:28.238 
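The trace above is the standard ceph-helpers.sh polling pattern: after `run_osd` starts the daemon, `wait_for_osd` probes `ceph osd dump` once per second until `osd.3 up` appears, giving up after 300 attempts. A minimal runnable sketch of that loop, where `check_osd_state` is a hypothetical stub standing in for the `ceph osd dump | grep "osd.$id $state"` probe:

```shell
#!/usr/bin/env bash
# Sketch of the wait_for_osd loop traced above (ceph-helpers.sh:978-991).
# check_osd_state is a hypothetical stand-in for:
#   ceph osd dump | grep "osd.$id $state"
wait_for_osd() {
    local state=$1
    local id=$2
    local status=1
    for ((i = 0; i < 300; i++)); do
        echo $i                      # the bare "0", "1", "2", "3" lines in the log
        if check_osd_state "$state" "$id"; then
            status=0
            break
        fi
        sleep 1
    done
    return $status
}
```

In the log the probe succeeds on the fourth iteration (after `echo 3`), at which point the loop breaks and the helper returns 0.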
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 2026-03-08T22:59:28.238 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:59:28.238 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:59:28.238 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:59:28.238 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:28.238 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:28.238 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T22:59:28.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T22:59:28.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T22:59:28.289 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:59:28.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T22:59:28.296 
INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:59:04.109+0000 7f6a4c1ad780 0 load: jerasure load: lrc 2026-03-08T22:59:28.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:270: TEST_rados_get_subread_eio_shard_1: local poolname=pool-jerasure 2026-03-08T22:59:28.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:271: TEST_rados_get_subread_eio_shard_1: create_erasure_coded_pool pool-jerasure 2 1 2026-03-08T22:59:28.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:59:28.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T22:59:28.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=2 2026-03-08T22:59:28.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T22:59:28.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=1 2026-03-08T22:59:28.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T22:59:28.297 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=2 m=1 crush-failure-domain=osd 
2026-03-08T22:59:28.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T22:59:28.610 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T22:59:28.976 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T22:59:28.987 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:59:29.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T22:59:29.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:59:29.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:59:29.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:59:29.989 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:59:29.989 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:59:29.989 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:59:29.989 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:59:29.989 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:59:29.989 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:59:30.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:59:30.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:59:30.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:59:30.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:59:30.074 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:59:30.075 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:59:30.312 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:59:30.312 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:59:30.312 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T22:59:30.312 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T22:59:30.312 
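The `delays=('0.1' '0.2' … '4.5')` array logged above is produced by `get_timeout_delays 90 .1`: steps double from the initial 0.1 s, are capped (at 15 s, inferred from the logged values), and a final fractional step tops the total up to exactly the 90 s budget. An illustrative re-implementation using awk for the floating-point arithmetic (the real helper uses bc; the name `gen_delays` is hypothetical):

```shell
# Illustrative reconstruction of get_timeout_delays (ceph-helpers.sh:1602ff).
# gen_delays TIMEOUT FIRST_STEP MAX_STEP prints a backoff schedule whose
# entries double until they would exceed MAX_STEP, then repeat at the cap,
# with a final partial delay so the entries sum to TIMEOUT.
gen_delays() {
    awk -v timeout="$1" -v step="$2" -v max="$3" 'BEGIN {
        total = 0; sep = ""
        while (total + step <= timeout) {
            printf "%s%g", sep, step
            sep = " "
            total += step
            step *= 2
            if (max > 0 && step > max) step = max
        }
        # top up so the delays sum to the full timeout
        if (total < timeout) printf "%s%g", sep, timeout - total
        print ""
    }'
}
```

With the arguments from the trace (`90`, `.1`, and a 15 s cap) this reproduces the exact array that `wait_for_clean` logged.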
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:59:30.312 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:30.312 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:59:30.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803783 2026-03-08T22:59:30.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803783 2026-03-08T22:59:30.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783' 2026-03-08T22:59:30.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:30.391 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:59:30.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574854 2026-03-08T22:59:30.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574854 2026-03-08T22:59:30.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854' 2026-03-08T22:59:30.467 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:30.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:59:30.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378629 2026-03-08T22:59:30.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378629 2026-03-08T22:59:30.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854 2-81604378629' 2026-03-08T22:59:30.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:30.548 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:59:30.627 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116995 2026-03-08T22:59:30.627 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116995 2026-03-08T22:59:30.627 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854 2-81604378629 3-115964116995' 2026-03-08T22:59:30.627 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:30.627 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803783 2026-03-08T22:59:30.627 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:30.628 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:59:30.629 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803783 2026-03-08T22:59:30.629 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:30.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803783 2026-03-08T22:59:30.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803783' 2026-03-08T22:59:30.630 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803783 2026-03-08T22:59:30.631 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:59:30.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803781 -lt 25769803783 2026-03-08T22:59:30.871 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:59:31.872 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T22:59:31.872 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:59:32.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803783 -lt 25769803783 2026-03-08T22:59:32.100 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:32.100 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574854 2026-03-08T22:59:32.100 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:32.101 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:59:32.102 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574854 2026-03-08T22:59:32.102 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:32.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574854 2026-03-08T22:59:32.103 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574854' 2026-03-08T22:59:32.103 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574854 2026-03-08T22:59:32.103 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 1 2026-03-08T22:59:32.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574854 -lt 55834574854 2026-03-08T22:59:32.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:32.328 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378629 2026-03-08T22:59:32.328 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:32.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:59:32.330 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378629 2026-03-08T22:59:32.330 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:32.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378629 2026-03-08T22:59:32.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378629' 2026-03-08T22:59:32.331 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378629 2026-03-08T22:59:32.331 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:59:32.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 81604378629 -lt 81604378629 2026-03-08T22:59:32.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:32.556 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116995 2026-03-08T22:59:32.556 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:32.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:59:32.558 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116995 2026-03-08T22:59:32.558 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:32.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116995 2026-03-08T22:59:32.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116995' 2026-03-08T22:59:32.559 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116995 2026-03-08T22:59:32.559 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:59:32.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116995 -lt 115964116995 2026-03-08T22:59:32.787 
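The `flush_pg_stats` helper traced above runs in two phases: it tells each OSD to flush its PG stats and records the returned sequence number, then polls `ceph osd last-stat-seq` per OSD until the monitor has caught up to that sequence. A simplified sketch of the control flow, with `list_osds`, `tell_flush`, and `last_stat_seq` as hypothetical stubs for the `ceph osd ls`, `ceph tell osd.$osd flush_pg_stats`, and `ceph osd last-stat-seq $osd` calls:

```shell
# Simplified sketch of flush_pg_stats (ceph-helpers.sh:2260-2279), with the
# ceph calls replaced by stubs so the control flow is runnable standalone.
flush_pg_stats() {
    local timeout=300
    local ids seqs seq osd s
    ids=$(list_osds)                 # stands in for: ceph osd ls
    seqs=
    for osd in $ids; do
        seq=$(tell_flush "$osd")     # ceph tell osd.$osd flush_pg_stats
        test -z "$seq" && continue
        seqs="$seqs $osd-$seq"
    done
    for s in $seqs; do
        osd=$(echo "$s" | cut -d - -f 1)
        seq=$(echo "$s" | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        while test "$(last_stat_seq "$osd")" -lt "$seq"; do
            sleep 1
            if [ $((timeout--)) -eq 0 ]; then
                return 1
            fi
        done
    done
}
```

In the log, osd.0 needed one extra second (25769803781 < 25769803783 on the first poll), while osd.1 through osd.3 were already caught up.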
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:59:32.787 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:59:32.787 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:59:33.109 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:59:33.109 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:59:33.109 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:59:33.109 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:59:33.109 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:59:33.110 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:59:33.110 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:59:33.110 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | 
select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:59:33.371 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:59:33.371 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:59:33.371 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:59:33.372 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:273: TEST_rados_get_subread_eio_shard_1: local shard_id=1 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:274: TEST_rados_get_subread_eio_shard_1: rados_put_get_data eio td/test-erasure-eio 1 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:147: rados_put_get_data: local inject=eio 2026-03-08T22:59:33.739 
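Putting the pieces together, `wait_for_clean` repeatedly compares the count of active+clean (and not stale) PGs against the total PG count, sleeping through the backoff schedule until they match. A condensed sketch, where `get_num_active_clean` and `get_num_pgs` stand in for the `ceph --format json pg dump pgs | jq …` and `ceph --format json status | jq .pgmap.num_pgs` probes (the real helper also resets its retry index when progress is observed, which is omitted here):

```shell
# Condensed sketch of wait_for_clean (ceph-helpers.sh:1656-1687).
# The jq filter behind the real get_num_active_clean is:
#   .pg_stats | [.[] | .state
#     | select(contains("active") and contains("clean"))
#     | select(contains("stale") | not)] | length
wait_for_clean() {
    local delays=(0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5)
    local loop=0
    local cur
    test "$(get_num_pgs)" == 0 && return 1
    while true; do
        cur=$(get_num_active_clean)
        test "$cur" = "$(get_num_pgs)" && break
        if (( loop >= ${#delays[*]} )); then
            echo "wait_for_clean timed out"
            return 1
        fi
        sleep "${delays[$loop]}"
        loop=$((loop + 1))
    done
    return 0
}
```

In the run above all 5 PGs were already active+clean on the first comparison (`test 5 = 5`), so the helper broke out immediately and returned 0.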
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:148: rados_put_get_data: shift 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:149: rados_put_get_data: local dir=td/test-erasure-eio 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:150: rados_put_get_data: shift 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:151: rados_put_get_data: local shard_id=1 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:152: rados_put_get_data: shift 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:153: rados_put_get_data: local arg= 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:157: rados_put_get_data: local poolname=pool-jerasure 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:158: rados_put_get_data: local objname=obj-eio-102080-1 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:159: rados_put_get_data: rados_put td/test-erasure-eio pool-jerasure obj-eio-102080-1 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio 2026-03-08T22:59:33.739 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=obj-eio-102080-1 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD 2026-03-08T22:59:33.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD 2026-03-08T22:59:33.739 
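The `printf '%*s' 1024 AAA` loop traced above builds the test payload: each marker is left-padded with spaces to a 1024-byte field, so the four markers concatenate into a 4 KiB ORIGINAL object whose quarter boundaries are tagged AAA, BBB, CCCC, DDDD. A runnable sketch of that payload generation (the temp directory here is illustrative; the real test writes to `$dir/ORIGINAL` under td/test-erasure-eio):

```shell
# Sketch of the payload built by rados_put (test-erasure-eio.sh:94-95):
# four 1024-byte fields, each right-justified so the marker sits at the
# end of its quarter of the 4 KiB object.
dir=$(mktemp -d)
for marker in AAA BBB CCCC DDDD; do
    printf '%*s' 1024 $marker
done > "$dir/ORIGINAL"
```

The file is then uploaded with `rados --pool pool-jerasure put obj-… ORIGINAL`, after which the test injects EIO on one shard and verifies the object can still be read back through the remaining k shards.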
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put obj-eio-102080-1 td/test-erasure-eio/ORIGINAL 2026-03-08T22:59:33.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:160: rados_put_get_data: inject_eio ec data pool-jerasure obj-eio-102080-1 td/test-erasure-eio 1 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=obj-eio-102080-1 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio 2026-03-08T22:59:33.832 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=1 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure obj-eio-102080-1 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-eio-102080-1 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-eio-102080-1 2026-03-08T22:59:33.832 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:59:34.060 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:59:34.060 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:59:34.060 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:59:34.060 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:59:34.060 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: 
initial_osds=('3' '1' '2') 2026-03-08T22:59:34.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T22:59:34.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=1 2026-03-08T22:59:34.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']' 2026-03-08T22:59:34.061 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/1/type 2026-03-08T22:59:34.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T22:59:34.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 1 bluestore_debug_inject_read_err true 2026-03-08T22:59:34.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:59:34.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=1 2026-03-08T22:59:34.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T22:59:34.062 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:59:34.063 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:59:34.063 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.1 2026-03-08T22:59:34.063 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:59:34.063 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:59:34.063 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:59:34.064 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:34.064 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:34.064 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.1.asok 2026-03-08T22:59:34.064 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.1.asok config set bluestore_debug_inject_read_err true 2026-03-08T22:59:34.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T22:59:34.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T22:59:34.118 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 
2026-03-08T22:59:34.118 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.1 2026-03-08T22:59:34.118 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:59:34.118 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:59:34.119 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:59:34.119 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:34.119 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:34.119 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.1.asok 2026-03-08T22:59:34.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T22:59:34.119 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.1.asok injectdataerr pool-jerasure obj-eio-102080-1 1 2026-03-08T22:59:34.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:161: rados_put_get_data: rados_get td/test-erasure-eio pool-jerasure obj-eio-102080-1 2026-03-08T22:59:34.176 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:59:34.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:59:34.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-eio-102080-1 2026-03-08T22:59:34.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T22:59:34.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T22:59:34.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get obj-eio-102080-1 td/test-erasure-eio/COPY 2026-03-08T22:59:34.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T22:59:34.206 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T22:59:34.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:163: rados_put_get_data: '[' '' = recovery ']' 2026-03-08T22:59:34.207 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:181: rados_put_get_data: expr 1 + 1 2026-03-08T22:59:34.208 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:181: rados_put_get_data: shard_id=2 2026-03-08T22:59:34.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:182: rados_put_get_data: inject_eio ec data pool-jerasure obj-eio-102080-1 td/test-erasure-eio 2 2026-03-08T22:59:34.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec 2026-03-08T22:59:34.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T22:59:34.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T22:59:34.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T22:59:34.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure 2026-03-08T22:59:34.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T22:59:34.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=obj-eio-102080-1 2026-03-08T22:59:34.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T22:59:34.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio 2026-03-08T22:59:34.209 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T22:59:34.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=2 2026-03-08T22:59:34.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T22:59:34.209 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure obj-eio-102080-1 2026-03-08T22:59:34.209 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T22:59:34.209 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-eio-102080-1 2026-03-08T22:59:34.210 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-eio-102080-1 2026-03-08T22:59:34.210 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:59:34.435 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:59:34.435 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T22:59:34.435 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T22:59:34.435 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T22:59:34.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: 
initial_osds=('3' '1' '2') 2026-03-08T22:59:34.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T22:59:34.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=2 2026-03-08T22:59:34.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']' 2026-03-08T22:59:34.435 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/2/type 2026-03-08T22:59:34.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T22:59:34.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 2 bluestore_debug_inject_read_err true 2026-03-08T22:59:34.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:59:34.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=2 2026-03-08T22:59:34.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T22:59:34.436 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:59:34.437 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:59:34.437 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.2 2026-03-08T22:59:34.437 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.2 2026-03-08T22:59:34.437 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.2 ']' 2026-03-08T22:59:34.437 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:59:34.437 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:34.437 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:34.437 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.2.asok 2026-03-08T22:59:34.437 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.2.asok config set bluestore_debug_inject_read_err true 2026-03-08T22:59:34.491 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T22:59:34.491 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T22:59:34.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 
2026-03-08T22:59:34.492 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.2 2026-03-08T22:59:34.492 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.2 2026-03-08T22:59:34.492 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.2 ']' 2026-03-08T22:59:34.492 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:59:34.492 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:34.492 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:34.492 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.2.asok 2026-03-08T22:59:34.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T22:59:34.493 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.2.asok injectdataerr pool-jerasure obj-eio-102080-1 2 2026-03-08T22:59:34.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:183: rados_put_get_data: rados_get td/test-erasure-eio pool-jerasure obj-eio-102080-1 fail 2026-03-08T22:59:34.548 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T22:59:34.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T22:59:34.548 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-eio-102080-1 2026-03-08T22:59:34.549 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=fail 2026-03-08T22:59:34.549 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' fail = fail ']' 2026-03-08T22:59:34.549 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:114: rados_get: rados --pool pool-jerasure get obj-eio-102080-1 td/test-erasure-eio/COPY 2026-03-08T22:59:34.572 INFO:tasks.workunit.client.0.vm04.stderr:error getting pool-jerasure/obj-eio-102080-1: (5) Input/output error 2026-03-08T22:59:34.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:115: rados_get: return 2026-03-08T22:59:34.574 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:184: rados_put_get_data: rm td/test-erasure-eio/ORIGINAL 2026-03-08T22:59:34.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:275: TEST_rados_get_subread_eio_shard_1: delete_erasure_coded_pool pool-jerasure 2026-03-08T22:59:34.575 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T22:59:34.575 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it 2026-03-08T22:59:34.880 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T22:59:34.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile 2026-03-08T22:59:35.155 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T22:59:35.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T22:59:35.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:59:35.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:59:35.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:59:35.168 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:59:35.168 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:59:35.168 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:59:35.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:59:35.168 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:59:35.300 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:59:35.300 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:59:35.301 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:59:35.302 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:59:35.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:59:35.303 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:59:35.303 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:59:35.304 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:59:35.304 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:59:35.304 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:59:35.304 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:59:35.305 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:59:35.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:59:35.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:59:35.323 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:59:35.323 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:35.323 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:35.323 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:59:35.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:59:35.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:59:35.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T22:59:35.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T22:59:35.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T22:59:35.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/test-erasure-eio 2026-03-08T22:59:35.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T22:59:35.325 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:59:35.325 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T22:59:35.325 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:59:35.325 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:59:35.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:59:35.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:59:35.326 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:59:35.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:59:35.328 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:59:35.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:59:35.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:59:35.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:59:35.329 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:59:35.330 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:59:35.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:59:35.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:59:35.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:59:35.331 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:59:35.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:59:35.333 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:59:35.333 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T22:59:35.334 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:59:35.334 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:35.334 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:35.334 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T22:59:35.335 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:59:35.335 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:59:35.335 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T22:59:35.337 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:59:35.337 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:35.337 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:35.338 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080 2026-03-08T22:59:35.339 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:59:35.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:59:35.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:59:35.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T22:59:35.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T22:59:35.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T22:59:35.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:59:35.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:59:35.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:59:35.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T22:59:35.340 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T22:59:35.385 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:59:35.385 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:35.385 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:35.385 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:35.385 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:35.385 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:35.386 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:35.386 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:59:35.423 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:59:35.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 
2026-03-08T22:59:35.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:59:35.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:59:35.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:59:35.423 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:59:35.424 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:59:35.424 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:59:35.424 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:59:35.425 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:59:35.425 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:35.425 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:35.426 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:59:35.426 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:59:35.426 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T22:59:35.483 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:59:35.483 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:59:35.483 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:59:35.483 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:59:35.483 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:59:35.483 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:59:35.483 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:59:35.483 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:59:35.483 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:59:35.483 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:35.483 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:35.483 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:59:35.484 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:59:35.484 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T22:59:35.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T22:59:35.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T22:59:35.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:59:35.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:59:35.540 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:59:35.541 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T22:59:35.541 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:59:35.663 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:59:35.663 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:35.663 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:35.664 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:35.664 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:35.664 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:35.664 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:35.664 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:59:35.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 
--run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:59:35.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T22:59:35.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:59:35.814 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T22:59:35.826 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:59:36.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T22:59:36.828 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:59:36.829 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:59:36.829 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:59:36.829 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:36.829 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:36.830 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T22:59:36.830 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T22:59:36.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T22:59:36.878 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T22:59:36.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T22:59:36.886 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:59:35.410+0000 7f3395bded80 0 load: jerasure load: lrc 2026-03-08T22:59:36.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_rados_get_subread_missing td/test-erasure-eio 2026-03-08T22:59:36.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:282: TEST_rados_get_subread_missing: local dir=td/test-erasure-eio 2026-03-08T22:59:36.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:283: TEST_rados_get_subread_missing: setup_osds 4 2026-03-08T22:59:36.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=4 2026-03-08T22:59:36.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T22:59:36.887 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 4 - 1 2026-03-08T22:59:36.888 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 3 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: 
run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 
2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:36.889 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:59:36.890 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:59:36.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T22:59:36.891 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:59:36.892 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=c513a8b3-4d0b-4f34-9672-dd67f2c2136e 2026-03-08T22:59:36.892 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 c513a8b3-4d0b-4f34-9672-dd67f2c2136e 2026-03-08T22:59:36.892 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 c513a8b3-4d0b-4f34-9672-dd67f2c2136e' 2026-03-08T22:59:36.892 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:59:36.906 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBY/61pIG8RNhAAReqKHygNOOzHyiGWRuTgMg== 2026-03-08T22:59:36.906 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBY/61pIG8RNhAAReqKHygNOOzHyiGWRuTgMg=="}' 2026-03-08T22:59:36.907 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new c513a8b3-4d0b-4f34-9672-dd67f2c2136e -i td/test-erasure-eio/0/new.json 2026-03-08T22:59:37.037 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:59:37.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T22:59:37.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBY/61pIG8RNhAAReqKHygNOOzHyiGWRuTgMg== --osd-uuid c513a8b3-4d0b-4f34-9672-dd67f2c2136e 2026-03-08T22:59:37.075 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:37.075+0000 7fa449a2e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:37.077 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:37.080+0000 7fa449a2e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:59:37.081 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:37.083+0000 7fa449a2e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:37.081 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:37.083+0000 7fa449a2e780 -1 bdev(0x55862ac58800 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:59:37.081 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:37.083+0000 7fa449a2e780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T22:59:39.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T22:59:39.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:59:39.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:59:39.735 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T22:59:39.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:59:40.048 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:59:40.048 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T22:59:40.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:59:40.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:59:40.050 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:59:40.051 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:59:40.069 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:40.069+0000 7f543102d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:40.073 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:40.075+0000 7f543102d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:40.076 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:40.077+0000 7f543102d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:40.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:59:40.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:59:40.310 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:59:40.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:59:40.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:59:40.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:59:40.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:59:40.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:59:40.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:59:40.310 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:59:40.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:59:41.160 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:41.162+0000 7f543102d780 -1 Falling back to public interface
2026-03-08T22:59:41.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:59:41.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:59:41.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:59:41.537 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:59:41.538 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:59:41.538 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:59:41.749 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:59:42.004 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:42.006+0000 7f543102d780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:59:42.751 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:59:42.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:59:42.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:59:42.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:59:42.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:59:42.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:59:42.981 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:59:43.984 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:59:43.984 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:59:43.984 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:59:43.984 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:59:43.984 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:59:43.984 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:59:44.214 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 6 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/850479362,v1:127.0.0.1:6803/850479362] [v2:127.0.0.1:6804/850479362,v1:127.0.0.1:6805/850479362] exists,up c513a8b3-4d0b-4f34-9672-dd67f2c2136e
2026-03-08T22:59:44.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:59:44.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:59:44.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:59:44.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:59:44.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1
2026-03-08T22:59:44.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:59:44.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:59:44.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T22:59:44.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1'
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal'
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:59:44.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:59:44.216 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1
2026-03-08T22:59:44.217 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:59:44.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=5e7827e4-0b23-45db-a803-25191a0559fa
2026-03-08T22:59:44.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 5e7827e4-0b23-45db-a803-25191a0559fa'
2026-03-08T22:59:44.218 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 5e7827e4-0b23-45db-a803-25191a0559fa
2026-03-08T22:59:44.218 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:59:44.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBg/61pbjLfDRAAtsAzQSGaXXj3yC6677x8fw==
2026-03-08T22:59:44.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBg/61pbjLfDRAAtsAzQSGaXXj3yC6677x8fw=="}'
2026-03-08T22:59:44.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 5e7827e4-0b23-45db-a803-25191a0559fa -i td/test-erasure-eio/1/new.json
2026-03-08T22:59:44.461 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:59:44.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json
2026-03-08T22:59:44.474 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBg/61pbjLfDRAAtsAzQSGaXXj3yC6677x8fw== --osd-uuid 5e7827e4-0b23-45db-a803-25191a0559fa
2026-03-08T22:59:44.497 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:44.499+0000 7f8949212780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:44.499 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:44.501+0000 7f8949212780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:44.500 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:44.502+0000 7f8949212780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:44.501 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:44.503+0000 7f8949212780 -1 bdev(0x55bbae137c00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:59:44.501 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:44.503+0000 7f8949212780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid
2026-03-08T22:59:47.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring
2026-03-08T22:59:47.219 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:59:47.220 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository
2026-03-08T22:59:47.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T22:59:47.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:59:47.515 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T22:59:47.515 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1
2026-03-08T22:59:47.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:59:47.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:59:47.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:59:47.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:59:47.535 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:47.536+0000 7f4279a0d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:47.537 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:47.539+0000 7f4279a0d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:47.538 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:47.540+0000 7f4279a0d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:47.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T22:59:47.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:59:47.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:59:47.751 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:59:47.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:59:47.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:59:47.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:59:47.752 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T22:59:47.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:59:47.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:59:47.987 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:59:48.618 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:48.619+0000 7f4279a0d780 -1 Falling back to public interface
2026-03-08T22:59:48.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:59:48.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:59:48.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:59:48.989 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T22:59:48.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:59:48.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:59:49.218 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:59:49.460 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:49.462+0000 7f4279a0d780 -1 osd.1 0 log_to_monitors true
2026-03-08T22:59:50.220 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:59:50.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:59:50.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:59:50.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:59:50.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:59:50.221 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:59:50.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:59:51.476 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T22:59:51.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:59:51.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:59:51.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:59:51.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:59:51.476 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/205716372,v1:127.0.0.1:6811/205716372] [v2:127.0.0.1:6812/205716372,v1:127.0.0.1:6813/205716372] exists,up 5e7827e4-0b23-45db-a803-25191a0559fa
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1))
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T22:59:51.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2'
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal'
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T22:59:51.716 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:59:51.717 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2
2026-03-08T22:59:51.719 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:59:51.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=43ff396e-2790-4e11-85bc-344416d2c30d
2026-03-08T22:59:51.720 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 43ff396e-2790-4e11-85bc-344416d2c30d'
2026-03-08T22:59:51.720 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 43ff396e-2790-4e11-85bc-344416d2c30d
2026-03-08T22:59:51.720 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:59:51.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBn/61pXe7aKxAAsMVvLYoq0ZI4VlhRrkJ+ew==
2026-03-08T22:59:51.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBn/61pXe7aKxAAsMVvLYoq0ZI4VlhRrkJ+ew=="}'
2026-03-08T22:59:51.734 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 43ff396e-2790-4e11-85bc-344416d2c30d -i td/test-erasure-eio/2/new.json
2026-03-08T22:59:51.966 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T22:59:51.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json
2026-03-08T22:59:51.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBn/61pXe7aKxAAsMVvLYoq0ZI4VlhRrkJ+ew== --osd-uuid 43ff396e-2790-4e11-85bc-344416d2c30d
2026-03-08T22:59:52.007 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:52.009+0000 7fafa92ce780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:52.009 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:52.011+0000 7fafa92ce780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:52.010 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:52.012+0000 7fafa92ce780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:52.011 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:52.013+0000 7fafa92ce780 -1 bdev(0x5640d636fc00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted
2026-03-08T22:59:52.011 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:52.013+0000 7fafa92ce780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid
2026-03-08T22:59:54.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring
2026-03-08T22:59:54.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:59:54.396 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository
2026-03-08T22:59:54.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T22:59:54.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:59:54.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T22:59:54.692 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2
2026-03-08T22:59:54.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:59:54.692 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:59:54.693 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:59:54.695 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:59:54.711 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:54.713+0000 7f322ac0c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:54.714 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:54.716+0000 7f322ac0c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:54.716 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:54.717+0000 7f322ac0c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:59:54.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:59:54.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:59:54.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:59:54.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:59:54.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:59:54.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:54.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:59:54.915 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T22:59:54.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:54.915 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:55.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:56.037 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:56.039+0000 7f322ac0c780 -1 Falling back to public interface 2026-03-08T22:59:56.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:59:56.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:56.156 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:59:56.156 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T22:59:56.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:56.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:56.380 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:56.902 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:56.904+0000 7f322ac0c780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:59:57.383 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T22:59:57.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:57.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:57.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:59:57.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:57.383 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:57.609 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:57.885 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:57.887+0000 7f3225cd3640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T22:59:58.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:58.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:58.611 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:59:58.611 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:59:58.612 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:58.612 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:58.844 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 19 up_thru 19 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/3102640871,v1:127.0.0.1:6819/3102640871] [v2:127.0.0.1:6820/3102640871,v1:127.0.0.1:6821/3102640871] exists,up 43ff396e-2790-4e11-85bc-344416d2c30d 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:59:58.845 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:58.845 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:58.845 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T22:59:58.845 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:59:58.846 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:59:58.846 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T22:59:58.847 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:59:58.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=6627495f-020d-4d92-9f7d-f758608982f0 2026-03-08T22:59:58.848 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 6627495f-020d-4d92-9f7d-f758608982f0' 2026-03-08T22:59:58.848 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 6627495f-020d-4d92-9f7d-f758608982f0 2026-03-08T22:59:58.848 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:59:58.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBu/61p5S51MxAAhlxWj06mTS+ftboN9fEs8g== 2026-03-08T22:59:58.862 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBu/61p5S51MxAAhlxWj06mTS+ftboN9fEs8g=="}' 2026-03-08T22:59:58.862 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 6627495f-020d-4d92-9f7d-f758608982f0 -i td/test-erasure-eio/3/new.json 2026-03-08T22:59:59.099 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T22:59:59.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json 2026-03-08T22:59:59.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBu/61p5S51MxAAhlxWj06mTS+ftboN9fEs8g== --osd-uuid 6627495f-020d-4d92-9f7d-f758608982f0 2026-03-08T22:59:59.131 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:59.133+0000 7f8f8741c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:59.133 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:59.135+0000 7f8f8741c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:59:59.134 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:59.136+0000 7f8f8741c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:59.135 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:59.137+0000 7f8f8741c780 -1 bdev(0x561502317c00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:59:59.135 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T22:59:59.137+0000 7f8f8741c780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid 2026-03-08T23:00:01.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring 2026-03-08T23:00:01.379 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:00:01.380 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository 2026-03-08T23:00:01.380 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T23:00:01.380 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:00:01.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T23:00:01.684 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3 2026-03-08T23:00:01.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:01.684 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:00:01.685 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:00:01.687 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:00:01.703 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:01.705+0000 7f62b362a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:01.709 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:01.711+0000 7f62b362a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:01.711 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:01.713+0000 7f62b362a780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:01.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T23:00:01.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:01.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:00:01.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:01.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:01.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:01.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:01.924 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:00:01.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:01.924 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:02.157 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:02.528 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:02.530+0000 7f62b362a780 -1 Falling back to public interface 2026-03-08T23:00:03.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T23:00:03.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:03.159 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:00:03.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:03.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:03.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:03.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:03.628 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:03.630+0000 7f62b362a780 -1 osd.3 0 log_to_monitors true 2026-03-08T23:00:04.398 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:00:04.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:04.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:04.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:04.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:04.398 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:04.632 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:04.930 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:04.932+0000 7f62aedc9640 -1 osd.3 0 waiting for initial osdmap 2026-03-08T23:00:05.634 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T23:00:05.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:05.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:05.634 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:05.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:05.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:05.868 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 28 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3903983925,v1:127.0.0.1:6827/3903983925] [v2:127.0.0.1:6828/3903983925,v1:127.0.0.1:6829/3903983925] exists,up 6627495f-020d-4d92-9f7d-f758608982f0 2026-03-08T23:00:05.868 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:05.868 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:05.868 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:05.869 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 2026-03-08T23:00:05.869 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:00:05.869 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:00:05.869 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:00:05.869 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:05.869 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:05.870 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T23:00:05.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T23:00:05.870 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T23:00:05.925 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T23:00:05.931 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T23:00:05.933 
INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T22:59:41.166+0000 7f543102d780 0 load: jerasure load: lrc 2026-03-08T23:00:05.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:285: TEST_rados_get_subread_missing: local poolname=pool-jerasure 2026-03-08T23:00:05.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:286: TEST_rados_get_subread_missing: create_erasure_coded_pool pool-jerasure 2 1 2026-03-08T23:00:05.933 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T23:00:05.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T23:00:05.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=2 2026-03-08T23:00:05.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T23:00:05.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=1 2026-03-08T23:00:05.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T23:00:05.934 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=2 m=1 crush-failure-domain=osd 2026-03-08T23:00:06.230 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile
2026-03-08T23:00:06.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile
2026-03-08T23:00:06.617 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists
2026-03-08T23:00:06.629 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T23:00:07.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean
2026-03-08T23:00:07.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:00:07.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:00:07.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:00:07.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:00:07.631 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:00:07.631 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:00:07.631 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:00:07.631 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:00:07.631 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:00:07.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:00:07.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:00:07.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:00:07.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:00:07.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:00:07.704 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:00:07.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:00:07.991 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T23:00:07.991 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T23:00:07.991 INFO:tasks.workunit.client.0.vm04.stderr:3'
2026-03-08T23:00:07.991
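`get_timeout_delays 90 .1` above expands to the delays array `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`: sleeps that double from the base, are capped at 15 s each, and are trimmed so the series sums to the 90 s budget. A standalone sketch of that schedule (an assumption-based reimplementation inferred from the trace, not the ceph-helpers.sh code):

```shell
#!/usr/bin/env bash
# Reproduce the wait_for_clean backoff schedule seen in the trace:
# delays double from the base, a single sleep is capped at 15 s, and
# the final sleep is trimmed so the series sums to the timeout.
backoff_delays() {
    local timeout=$1 base=$2
    awk -v t="$timeout" -v d="$base" 'BEGIN {
        total = 0
        sep = ""
        while (t - total > 1e-9) {
            if (d > 15) d = 15               # cap a single sleep at 15 s
            if (total + d > t) d = t - total # trim the last sleep to fit
            printf "%s%g", sep, d
            sep = " "
            total += d
            d *= 2
        }
        print ""
    }'
}
backoff_delays 90 .1
```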
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:00:07.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:00:07.991 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:00:08.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803783
2026-03-08T23:00:08.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803783
2026-03-08T23:00:08.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783'
2026-03-08T23:00:08.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:00:08.088 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:00:08.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574854
2026-03-08T23:00:08.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574854
2026-03-08T23:00:08.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854'
2026-03-08T23:00:08.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:00:08.166 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:00:08.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378629
2026-03-08T23:00:08.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378629
2026-03-08T23:00:08.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854 2-81604378629'
2026-03-08T23:00:08.247 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:00:08.247 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T23:00:08.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116995
2026-03-08T23:00:08.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116995
2026-03-08T23:00:08.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854 2-81604378629 3-115964116995'
2026-03-08T23:00:08.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:00:08.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803783
2026-03-08T23:00:08.329 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:00:08.330 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:00:08.331 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803783
2026-03-08T23:00:08.331 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:00:08.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803783
2026-03-08T23:00:08.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803783'
2026-03-08T23:00:08.332 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803783
2026-03-08T23:00:08.333 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:00:08.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803781 -lt 25769803783
2026-03-08T23:00:08.566 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:00:09.567 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:00:09.567 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:00:09.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803781 -lt 25769803783
2026-03-08T23:00:09.831 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:00:10.832 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T23:00:10.832 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:00:11.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803784 -lt 25769803783
2026-03-08T23:00:11.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:00:11.062 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:00:11.062 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574854
2026-03-08T23:00:11.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:00:11.063 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574854
2026-03-08T23:00:11.063 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:00:11.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574854
2026-03-08T23:00:11.064 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574854'
2026-03-08T23:00:11.064 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574854
2026-03-08T23:00:11.065 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:00:11.317 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574854 -lt 55834574854
2026-03-08T23:00:11.318 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:00:11.318 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378629
2026-03-08T23:00:11.318 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:00:11.319 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:00:11.319 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378629
2026-03-08T23:00:11.320 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:00:11.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378629
2026-03-08T23:00:11.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378629'
2026-03-08T23:00:11.321 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378629
2026-03-08T23:00:11.321 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:00:11.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378629 -lt 81604378629
2026-03-08T23:00:11.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:00:11.552 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116995
2026-03-08T23:00:11.552 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:00:11.553 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T23:00:11.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116995
2026-03-08T23:00:11.554 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:00:11.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116995
2026-03-08T23:00:11.555 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116995'
2026-03-08T23:00:11.555 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116995
2026-03-08T23:00:11.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T23:00:11.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116995 -lt 115964116995
2026-03-08T23:00:11.789 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:00:11.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:00:11.790 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:00:12.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T23:00:12.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:00:12.130 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:00:12.130 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:00:12.130
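flush_pg_stats above records one `osd-seq` pair per OSD in `$seqs`, then splits each pair back out with `cut` before polling `ceph osd last-stat-seq`. The string bookkeeping in isolation, using the sequence numbers from this run (the `ceph tell`/`ceph osd last-stat-seq` polling is omitted since it needs a live cluster):

```shell
#!/usr/bin/env bash
# The flush_pg_stats "osd-seq" bookkeeping from ceph-helpers.sh, with
# the stat sequence numbers observed in this run.
seqs=''
for pair in 0-25769803783 1-55834574854 2-81604378629 3-115964116995; do
    seqs="$seqs $pair"
done
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)   # OSD id before the dash
    seq=$(echo "$s" | cut -d - -f 2)   # stat sequence after the dash
    echo "waiting osd.$osd seq $seq"
done
```

The four `echo` lines match the `waiting osd.N seq …` stdout records in the trace.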
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:00:12.130 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:00:12.131 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:00:12.131 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:00:12.357 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T23:00:12.358 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:00:12.358 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:00:12.358 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:288: TEST_rados_get_subread_missing: local shard_id=1
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:289: TEST_rados_get_subread_missing: rados_put_get_data remove td/test-erasure-eio 1
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:147: rados_put_get_data: local inject=remove
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:148: rados_put_get_data: shift
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:149: rados_put_get_data: local dir=td/test-erasure-eio
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:150: rados_put_get_data: shift
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:151: rados_put_get_data: local shard_id=1
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:152: rados_put_get_data: shift
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:153: rados_put_get_data: local arg=
2026-03-08T23:00:12.666
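get_num_active_clean above builds a jq filter that keeps PG states containing both "active" and "clean" but not "stale", then counts them. The same filter applied to a small hand-made pg dump (the sample JSON is fabricated for illustration and only mimics the shape of `ceph --format json pg dump pgs`; requires jq):

```shell
#!/usr/bin/env bash
# The get_num_active_clean jq filter from the trace, run against a tiny
# fabricated pg dump instead of a live cluster.
pg_dump='{"pg_stats":[
  {"state":"active+clean"},
  {"state":"active+clean"},
  {"state":"stale+active+clean"},
  {"state":"active+recovering"}]}'
count=$(echo "$pg_dump" | jq '.pg_stats | [.[] | .state
  | select(contains("active") and contains("clean"))
  | select(contains("stale") | not)] | length')
echo "non-stale active+clean PGs: $count"
```

The stale PG and the still-recovering PG are both excluded, so only the first two states count.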
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:157: rados_put_get_data: local poolname=pool-jerasure
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:158: rados_put_get_data: local objname=obj-remove-102080-1
2026-03-08T23:00:12.666 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:159: rados_put_get_data: rados_put td/test-erasure-eio pool-jerasure obj-remove-102080-1
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=obj-remove-102080-1
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD
2026-03-08T23:00:12.667 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put obj-remove-102080-1 td/test-erasure-eio/ORIGINAL
2026-03-08T23:00:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:160: rados_put_get_data: inject_remove ec data pool-jerasure obj-remove-102080-1 td/test-erasure-eio 1
2026-03-08T23:00:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:127: inject_remove: local pooltype=ec
2026-03-08T23:00:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:128: inject_remove: shift
2026-03-08T23:00:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:129: inject_remove: local which=data
2026-03-08T23:00:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:130: inject_remove: shift
2026-03-08T23:00:12.696
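rados_put above builds its ORIGINAL payload as four 1024-byte blocks, each a marker right-aligned in a field of spaces, before `rados put` uploads it. The payload construction alone (a temp file stands in for $dir/ORIGINAL; the rados upload is omitted since it needs a cluster):

```shell
#!/usr/bin/env bash
# Build the rados_put test payload from the trace: each marker is
# space-padded to 1024 bytes with printf '%*s', giving 4096 bytes total.
payload=$(mktemp)
for marker in AAA BBB CCCC DDDD; do
    printf '%*s' 1024 "$marker" >> "$payload"
done
size=$(wc -c < "$payload")
rm -f "$payload"
echo "payload size: $size bytes"
```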
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:131: inject_remove: local poolname=pool-jerasure
2026-03-08T23:00:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:132: inject_remove: shift
2026-03-08T23:00:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:133: inject_remove: local objname=obj-remove-102080-1
2026-03-08T23:00:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:134: inject_remove: shift
2026-03-08T23:00:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:135: inject_remove: local dir=td/test-erasure-eio
2026-03-08T23:00:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:136: inject_remove: shift
2026-03-08T23:00:12.696 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:137: inject_remove: local shard_id=1
2026-03-08T23:00:12.697 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:138: inject_remove: shift
2026-03-08T23:00:12.697 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:140: inject_remove: get_osds pool-jerasure obj-remove-102080-1
2026-03-08T23:00:12.697 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T23:00:12.697 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-remove-102080-1
2026-03-08T23:00:12.697 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-remove-102080-1
2026-03-08T23:00:12.697 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T23:00:12.924 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T23:00:12.924 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:2'
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:140: inject_remove: initial_osds=('3' '1' '2')
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:140: inject_remove: local -a initial_osds
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:141: inject_remove: local osd_id=1
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:142: inject_remove: objectstore_tool td/test-erasure-eio 1 obj-remove-102080-1 remove
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio
2026-03-08T23:00:12.925
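get_osds above maps the object to its acting set with `ceph osd map` and extracts the OSD ids via `jq '.acting | .[]'`; the acting set comes back as 3 1 2, and shard_id=1 indexes that list, which is why inject_remove ends up with osd_id=1. The extraction applied to a fabricated osd map JSON (requires jq; only the acting array matches this run):

```shell
#!/usr/bin/env bash
# get_osds-style extraction of the acting set. The JSON mimics the shape
# of `ceph --format json osd map` output; only .acting matters here.
osd_map='{"acting":[3,1,2]}'
acting=$(echo "$osd_map" | jq '.acting | .[]' | xargs echo)
echo "acting set: $acting"
# shard_id=1 indexes this list, so the shard to remove lives on osd.1
```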
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 1 obj-remove-102080-1 remove
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:00:12.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.1
2026-03-08T23:00:12.926 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:00:12.926 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:00:12.926 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:00:12.926 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:00:12.926 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:00:13.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:00:13.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 1 obj-remove-102080-1 remove
2026-03-08T23:00:13.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio
2026-03-08T23:00:13.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:00:13.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:00:13.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:00:13.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/1
2026-03-08T23:00:13.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/1 obj-remove-102080-1 remove
2026-03-08T23:00:13.638 INFO:tasks.workunit.client.0.vm04.stdout:remove 1#2:e460e160:::obj-remove-102080-1:head#
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 1
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/1
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1'
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal'
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T23:00:14.158 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:00:14.159
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:00:14.159 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:00:14.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 
2026-03-08T23:00:14.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:00:14.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:00:14.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T23:00:14.160 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:00:14.160 INFO:tasks.workunit.client.0.vm04.stderr:start osd.1 2026-03-08T23:00:14.161 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:14.161 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/1/whoami 2026-03-08T23:00:14.162 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:00:14.162 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:00:14.163 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:00:14.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:00:14.180 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:14.181+0000 7f94fd20d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:14.182 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:14.184+0000 7f94fd20d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:14.183 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:14.185+0000 7f94fd20d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:14.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:00:14.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:14.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:00:14.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:14.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:14.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:14.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:14.412 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T23:00:14.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:14.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:14.648 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:15.259 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:15.261+0000 7f94fd20d780 -1 Falling back to public interface 2026-03-08T23:00:15.650 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:00:15.650 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:15.650 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:15.650 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T23:00:15.651 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:15.651 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:15.882 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:16.365 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:16.367+0000 7f94fd20d780 -1 osd.1 35 log_to_monitors true 2026-03-08T23:00:16.884 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:16.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:16.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:16.885 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T23:00:16.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:16.885 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:17.125 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:18.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:18.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:18.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:18.131 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T23:00:18.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:18.131 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:18.353 INFO:tasks.workunit.client.0.vm04.stderr:osd.1 up in weight 1 up_from 39 up_thru 39 down_at 36 last_clean_interval [13,35) [v2:127.0.0.1:6810/1839257547,v1:127.0.0.1:6811/1839257547] [v2:127.0.0.1:6812/1839257547,v1:127.0.0.1:6813/1839257547] exists,up 5e7827e4-0b23-45db-a803-25191a0559fa 2026-03-08T23:00:18.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:18.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:18.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:18.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T23:00:18.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:00:18.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:00:18.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:00:18.354 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:00:18.354 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:00:18.354 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:00:18.355 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:00:18.355 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:00:18.355 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:00:18.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:00:18.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T23:00:18.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:00:18.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:00:18.450 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:00:18.451 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:00:18.673 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:00:18.674 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T23:00:18.674 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T23:00:18.674 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T23:00:18.674 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:00:18.674 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:18.674 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:00:18.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803788 2026-03-08T23:00:18.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803788 2026-03-08T23:00:18.754 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788' 2026-03-08T23:00:18.754 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:18.755 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:00:18.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724547 2026-03-08T23:00:18.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724547 2026-03-08T23:00:18.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788 1-167503724547' 2026-03-08T23:00:18.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:18.831 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:00:18.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378633 2026-03-08T23:00:18.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378633 2026-03-08T23:00:18.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788 1-167503724547 2-81604378633' 2026-03-08T23:00:18.910 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:18.910 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:00:18.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116999 2026-03-08T23:00:18.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116999 2026-03-08T23:00:18.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803788 1-167503724547 2-81604378633 3-115964116999' 2026-03-08T23:00:18.989 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:18.990 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803788 2026-03-08T23:00:18.990 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:18.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:00:18.991 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803788 2026-03-08T23:00:18.991 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:18.992 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803788 2026-03-08T23:00:18.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803788' 2026-03-08T23:00:18.992 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803788 2026-03-08T23:00:18.992 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:19.229 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803785 -lt 25769803788 2026-03-08T23:00:19.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:00:20.231 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:00:20.231 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:20.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803788 -lt 25769803788 2026-03-08T23:00:20.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:20.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-167503724547 2026-03-08T23:00:20.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T23:00:20.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:00:20.468 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-167503724547 2026-03-08T23:00:20.468 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:20.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724547 2026-03-08T23:00:20.469 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 167503724547' 2026-03-08T23:00:20.469 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 167503724547 2026-03-08T23:00:20.470 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:00:20.700 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724547 -lt 167503724547 2026-03-08T23:00:20.700 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:20.701 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378633 2026-03-08T23:00:20.701 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:20.702 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T23:00:20.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378633 2026-03-08T23:00:20.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:20.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378633 2026-03-08T23:00:20.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378633' 2026-03-08T23:00:20.704 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 81604378633 2026-03-08T23:00:20.704 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:00:20.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378633 -lt 81604378633 2026-03-08T23:00:20.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:20.929 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116999 2026-03-08T23:00:20.929 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:20.930 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:00:20.931 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: echo 3-115964116999 2026-03-08T23:00:20.931 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:20.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116999 2026-03-08T23:00:20.932 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116999' 2026-03-08T23:00:20.932 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 115964116999 2026-03-08T23:00:20.933 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:00:21.152 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116999 -lt 115964116999 2026-03-08T23:00:21.153 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:00:21.153 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:21.153 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:21.474 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:00:21.474 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:00:21.475 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:00:21.475 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:00:21.475 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:00:21.475 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:00:21.475 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:00:21.475 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:00:21.705 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:00:21.706 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:00:21.706 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:21.706 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:22.008 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:00:22.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:00:22.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:00:22.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:161: rados_put_get_data: rados_get td/test-erasure-eio pool-jerasure obj-remove-102080-1 2026-03-08T23:00:22.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T23:00:22.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T23:00:22.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-remove-102080-1 2026-03-08T23:00:22.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok 2026-03-08T23:00:22.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']' 2026-03-08T23:00:22.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get obj-remove-102080-1 td/test-erasure-eio/COPY 2026-03-08T23:00:22.036 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY 2026-03-08T23:00:22.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY 2026-03-08T23:00:22.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:163: rados_put_get_data: '[' '' = recovery ']' 2026-03-08T23:00:22.039 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:181: rados_put_get_data: expr 1 + 1 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:181: rados_put_get_data: shard_id=2 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:182: rados_put_get_data: inject_remove ec data pool-jerasure obj-remove-102080-1 td/test-erasure-eio 2 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:127: inject_remove: local pooltype=ec 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:128: inject_remove: shift 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:129: inject_remove: local which=data 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:130: inject_remove: shift 2026-03-08T23:00:22.040 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:131: inject_remove: local poolname=pool-jerasure 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:132: inject_remove: shift 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:133: inject_remove: local objname=obj-remove-102080-1 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:134: inject_remove: shift 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:135: inject_remove: local dir=td/test-erasure-eio 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:136: inject_remove: shift 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:137: inject_remove: local shard_id=2 2026-03-08T23:00:22.040 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:138: inject_remove: shift 2026-03-08T23:00:22.041 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:140: inject_remove: get_osds pool-jerasure obj-remove-102080-1 2026-03-08T23:00:22.041 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure 2026-03-08T23:00:22.041 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: 
get_osds: local objectname=obj-remove-102080-1 2026-03-08T23:00:22.041 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-remove-102080-1 2026-03-08T23:00:22.042 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:00:22.282 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T23:00:22.282 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T23:00:22.282 INFO:tasks.workunit.client.0.vm04.stderr:2' 2026-03-08T23:00:22.282 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:140: inject_remove: initial_osds=('3' '1' '2') 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:140: inject_remove: local -a initial_osds 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:141: inject_remove: local osd_id=2 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:142: inject_remove: objectstore_tool td/test-erasure-eio 2 obj-remove-102080-1 remove 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/test-erasure-eio 2026-03-08T23:00:22.283 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/test-erasure-eio 2 obj-remove-102080-1 remove 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/test-erasure-eio 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/test-erasure-eio TERM osd.2 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:00:22.283 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:00:22.284 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:00:22.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:00:22.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:00:22.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:00:22.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/test-erasure-eio 2 obj-remove-102080-1 remove 2026-03-08T23:00:22.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/test-erasure-eio 2026-03-08T23:00:22.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:00:22.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T23:00:22.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:00:22.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/test-erasure-eio/2 2026-03-08T23:00:22.390 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/test-erasure-eio/2 
obj-remove-102080-1 remove 2026-03-08T23:00:22.999 INFO:tasks.workunit.client.0.vm04.stdout:remove 2#2:e460e160:::obj-remove-102080-1:head# 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/test-erasure-eio 2 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:00:23.517 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:23.517 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:23.518 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:23.518 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:23.518 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:23.518 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:23.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:23.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:00:23.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T23:00:23.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T23:00:23.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:00:23.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:00:23.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:00:23.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 
2026-03-08T23:00:23.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:00:23.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:00:23.519 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T23:00:23.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:00:23.520 INFO:tasks.workunit.client.0.vm04.stderr:start osd.2 2026-03-08T23:00:23.520 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:23.520 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/2/whoami 2026-03-08T23:00:23.521 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:00:23.522 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:00:23.522 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:00:23.525 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:00:23.539 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:23.541+0000 7f0534382780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:23.543 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:23.546+0000 7f0534382780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:23.546 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:23.547+0000 7f0534382780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:23.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:00:23.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:23.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:00:23.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:23.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:23.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:23.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:23.752 INFO:tasks.workunit.client.0.vm04.stderr:0 2026-03-08T23:00:23.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:23.752 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:23.997 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:24.353 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:24.355+0000 7f0534382780 -1 Falling back to public interface 2026-03-08T23:00:24.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:00:24.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:24.999 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:24.999 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T23:00:25.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:25.000 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:25.235 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:25.952 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:25.954+0000 7f0534382780 -1 osd.2 40 log_to_monitors true 2026-03-08T23:00:26.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:26.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:26.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:26.238 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T23:00:26.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:26.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:26.493 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:27.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:27.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:27.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:27.495 INFO:tasks.workunit.client.0.vm04.stderr:3 2026-03-08T23:00:27.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:27.495 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:27.739 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 up in weight 1 up_from 44 up_thru 36 down_at 41 last_clean_interval [19,40) [v2:127.0.0.1:6818/472169759,v1:127.0.0.1:6819/472169759] [v2:127.0.0.1:6820/472169759,v1:127.0.0.1:6821/472169759] exists,up 43ff396e-2790-4e11-85bc-344416d2c30d 2026-03-08T23:00:27.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:27.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:27.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:27.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:00:27.739 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:00:27.740 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:00:27.740 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:00:27.740 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:00:27.740 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:00:27.740 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:00:27.740 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:00:27.741 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:00:27.741 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:00:27.829 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:00:27.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:00:27.830 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:00:27.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:00:27.830 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:00:27.830 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:00:28.068 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:00:28.069 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T23:00:28.069 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T23:00:28.069 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T23:00:28.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:00:28.069 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:28.069 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:00:28.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803792 2026-03-08T23:00:28.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803792 2026-03-08T23:00:28.230 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803792' 2026-03-08T23:00:28.230 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:28.231 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:00:28.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724551 2026-03-08T23:00:28.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724551 2026-03-08T23:00:28.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803792 1-167503724551' 2026-03-08T23:00:28.309 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:28.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:00:28.386 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561027 2026-03-08T23:00:28.386 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561027 2026-03-08T23:00:28.386 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803792 1-167503724551 2-188978561027' 2026-03-08T23:00:28.386 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:28.387 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:00:28.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964117003 2026-03-08T23:00:28.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964117003 2026-03-08T23:00:28.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803792 1-167503724551 2-188978561027 3-115964117003' 2026-03-08T23:00:28.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:28.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803792 2026-03-08T23:00:28.466 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:28.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:00:28.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803792 2026-03-08T23:00:28.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:28.468 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803792 2026-03-08T23:00:28.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803792' 2026-03-08T23:00:28.468 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.0 seq 25769803792 2026-03-08T23:00:28.469 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:28.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803789 -lt 25769803792 2026-03-08T23:00:28.706 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:00:29.707 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:00:29.707 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:30.085 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803789 -lt 25769803792 2026-03-08T23:00:30.086 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:00:31.087 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:00:31.087 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:00:31.331 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803792 -lt 25769803792 2026-03-08T23:00:31.332 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:31.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-167503724551 2026-03-08T23:00:31.332 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:31.334 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:00:31.334 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-167503724551 2026-03-08T23:00:31.334 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:31.335 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724551 2026-03-08T23:00:31.335 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 167503724551' 2026-03-08T23:00:31.335 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.1 seq 167503724551 2026-03-08T23:00:31.336 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:00:31.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 167503724551 -lt 167503724551 2026-03-08T23:00:31.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:31.590 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-188978561027 2026-03-08T23:00:31.590 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:31.591 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:00:31.592 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-188978561027 2026-03-08T23:00:31.592 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:31.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561027 2026-03-08T23:00:31.593 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 188978561027' 2026-03-08T23:00:31.593 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.2 seq 188978561027 2026-03-08T23:00:31.593 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:00:31.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561027 -lt 188978561027 2026-03-08T23:00:31.838 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:31.838 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964117003 2026-03-08T23:00:31.838 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:31.840 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:00:31.840 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964117003 2026-03-08T23:00:31.840 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:31.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964117003 2026-03-08T23:00:31.841 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964117003' 2026-03-08T23:00:31.841 INFO:tasks.workunit.client.0.vm04.stderr:waiting osd.3 seq 115964117003 2026-03-08T23:00:31.842 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:00:32.081 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964117004 -lt 115964117003 2026-03-08T23:00:32.082 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 
2026-03-08T23:00:32.082 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:32.082 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:32.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:00:32.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:00:32.395 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:00:32.395 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:00:32.395 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:00:32.396 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:00:32.396 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:00:32.396 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:00:32.625 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:00:32.625 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:00:32.625 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:32.625 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:32.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:00:32.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:00:32.948 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:00:32.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:183: rados_put_get_data: rados_get td/test-erasure-eio pool-jerasure obj-remove-102080-1 fail 2026-03-08T23:00:32.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio 2026-03-08T23:00:32.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure 2026-03-08T23:00:32.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-remove-102080-1 
2026-03-08T23:00:32.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=fail 2026-03-08T23:00:32.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' fail = fail ']' 2026-03-08T23:00:32.949 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:114: rados_get: rados --pool pool-jerasure get obj-remove-102080-1 td/test-erasure-eio/COPY 2026-03-08T23:00:32.974 INFO:tasks.workunit.client.0.vm04.stderr:error getting pool-jerasure/obj-remove-102080-1: (5) Input/output error 2026-03-08T23:00:32.976 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:115: rados_get: return 2026-03-08T23:00:32.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:184: rados_put_get_data: rm td/test-erasure-eio/ORIGINAL 2026-03-08T23:00:32.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:290: TEST_rados_get_subread_missing: delete_erasure_coded_pool pool-jerasure 2026-03-08T23:00:32.977 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T23:00:32.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it 2026-03-08T23:00:33.250 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T23:00:33.262 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile 2026-03-08T23:00:33.560 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T23:00:33.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T23:00:33.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T23:00:33.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:00:33.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T23:00:33.573 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:00:33.573 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:00:33.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:00:33.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:00:33.573 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:00:33.709 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: 
return 0 2026-03-08T23:00:33.709 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:00:33.710 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:00:33.710 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:00:33.711 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T23:00:33.711 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:00:33.712 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:00:33.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:00:33.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:00:33.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:00:33.713 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:00:33.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:00:33.715 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:00:33.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T23:00:33.734 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:00:33.735 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:33.735 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:33.735 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: 
setup: teardown td/test-erasure-eio 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:00:33.736 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:00:33.738 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:00:33.739 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:00:33.740 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:00:33.741 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:00:33.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T23:00:33.742 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:00:33.742 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:00:33.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:00:33.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:00:33.743 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:00:33.743 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:00:33.744 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:00:33.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:00:33.745 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T23:00:33.746 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:00:33.746 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:33.746 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:33.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T23:00:33.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:00:33.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:00:33.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T23:00:33.748 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:00:33.748 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:33.748 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:33.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080 2026-03-08T23:00:33.749 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 
2026-03-08T23:00:33.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T23:00:33.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:00:33.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T23:00:33.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T23:00:33.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T23:00:33.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:00:33.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:00:33.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:00:33.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T23:00:33.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T23:00:33.774 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:00:33.774 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:33.774 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:33.774 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:33.774 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:33.774 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:33.774 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:33.775 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T23:00:33.804 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 
2026-03-08T23:00:33.805 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:00:33.805 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:00:33.805 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:00:33.805 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:00:33.806 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:00:33.807 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:00:33.807 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:00:33.807 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:00:33.807 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:00:33.807 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:33.808 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:33.808 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T23:00:33.808 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:00:33.808 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T23:00:33.864 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:00:33.864 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:00:33.864 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:00:33.864 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:00:33.864 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:00:33.864 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:00:33.864 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:00:33.864 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:00:33.865 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:00:33.865 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:33.865 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:33.865 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T23:00:33.866 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:00:33.866 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T23:00:33.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T23:00:33.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T23:00:33.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:00:33.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:00:33.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:00:33.923 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T23:00:33.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:00:34.043 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:00:34.043 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:34.043 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:34.043 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:34.043 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:34.043 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:34.044 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:34.044 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:00:34.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 
20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:00:34.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T23:00:34.065 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:00:34.190 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T23:00:34.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:00:35.205 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T23:00:35.205 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:00:35.205 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:00:35.205 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:00:35.205 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:35.205 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:35.206 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T23:00:35.206 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T23:00:35.206 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T23:00:35.252 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T23:00:35.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T23:00:35.258 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T23:00:33.798+0000 7fa04d20dd80 0 load: jerasure load: lrc 2026-03-08T23:00:35.259 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_rados_get_with_subreadall_eio_shard_0 td/test-erasure-eio 2026-03-08T23:00:35.259 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:330: TEST_rados_get_with_subreadall_eio_shard_0: local dir=td/test-erasure-eio 2026-03-08T23:00:35.259 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:331: TEST_rados_get_with_subreadall_eio_shard_0: local shard_id=0 2026-03-08T23:00:35.259 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:333: TEST_rados_get_with_subreadall_eio_shard_0: setup_osds 4 2026-03-08T23:00:35.259 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=4 2026-03-08T23:00:35.259 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T23:00:35.259 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 4 - 1 2026-03-08T23:00:35.260 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 3 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T23:00:35.262 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:00:35.262 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:35.262 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T23:00:35.263 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:00:35.263 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T23:00:35.265 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:00:35.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=3d9238b5-421d-4146-9d13-1d901f0267c1 2026-03-08T23:00:35.266 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 
3d9238b5-421d-4146-9d13-1d901f0267c1' 2026-03-08T23:00:35.266 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 3d9238b5-421d-4146-9d13-1d901f0267c1 2026-03-08T23:00:35.266 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:00:35.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCT/61p1BW/EBAAg3b/p/7dPwXJ8WbAWkjEpA== 2026-03-08T23:00:35.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCT/61p1BW/EBAAg3b/p/7dPwXJ8WbAWkjEpA=="}' 2026-03-08T23:00:35.280 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 3d9238b5-421d-4146-9d13-1d901f0267c1 -i td/test-erasure-eio/0/new.json 2026-03-08T23:00:35.414 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:00:35.427 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T23:00:35.428 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCT/61p1BW/EBAAg3b/p/7dPwXJ8WbAWkjEpA== --osd-uuid 3d9238b5-421d-4146-9d13-1d901f0267c1 2026-03-08T23:00:35.450 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:35.451+0000 7f7f4161d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:35.455 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:35.458+0000 7f7f4161d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:35.459 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:35.460+0000 7f7f4161d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:35.459 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:35.461+0000 7f7f4161d780 -1 bdev(0x5573de189c00 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:00:35.459 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:35.461+0000 7f7f4161d780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T23:00:37.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T23:00:37.815 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:00:37.816 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T23:00:37.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:00:37.817 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 
2026-03-08T23:00:38.147 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T23:00:38.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:00:38.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:38.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:00:38.149 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:00:38.151 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:00:38.170 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:38.171+0000 7f1079d4d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:38.176 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:38.178+0000 7f1079d4d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:38.178 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:38.179+0000 7f1079d4d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:38.411 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:00:38.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:38.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:00:38.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:38.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:38.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:38.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:38.412 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:00:38.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:38.412 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:00:38.646 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:39.012 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:39.014+0000 7f1079d4d780 -1 Falling back to public 
interface 2026-03-08T23:00:39.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:39.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:39.647 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:39.647 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:00:39.648 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:39.648 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:00:39.871 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:39.873+0000 7f1079d4d780 -1 osd.0 0 log_to_monitors true 2026-03-08T23:00:39.883 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:40.886 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:00:40.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:40.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:40.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:40.886 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:40.886 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:00:41.145 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/56233683,v1:127.0.0.1:6803/56233683] [v2:127.0.0.1:6804/56233683,v1:127.0.0.1:6805/56233683] exists,up 3d9238b5-421d-4146-9d13-1d901f0267c1 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:00:41.146 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:00:41.146 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T23:00:41.147 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:41.147 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:00:41.148 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:00:41.148 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T23:00:41.150 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:00:41.151 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local 
uuid=50a5898d-29f4-496c-b156-569b086abaf0 2026-03-08T23:00:41.151 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 50a5898d-29f4-496c-b156-569b086abaf0' 2026-03-08T23:00:41.151 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 50a5898d-29f4-496c-b156-569b086abaf0 2026-03-08T23:00:41.151 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:00:41.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCZ/61plLb9CRAA8RuinhfsTDguKGt9PRKqMQ== 2026-03-08T23:00:41.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCZ/61plLb9CRAA8RuinhfsTDguKGt9PRKqMQ=="}' 2026-03-08T23:00:41.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 50a5898d-29f4-496c-b156-569b086abaf0 -i td/test-erasure-eio/1/new.json 2026-03-08T23:00:41.414 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:00:41.429 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T23:00:41.430 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 
--debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCZ/61plLb9CRAA8RuinhfsTDguKGt9PRKqMQ== --osd-uuid 50a5898d-29f4-496c-b156-569b086abaf0 2026-03-08T23:00:41.452 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:41.454+0000 7f0ba1392780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:41.454 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:41.457+0000 7f0ba1392780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:41.456 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:41.458+0000 7f0ba1392780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:41.456 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:41.458+0000 7f0ba1392780 -1 bdev(0x55cf6d169c00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:00:41.456 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:41.458+0000 7f0ba1392780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T23:00:43.595 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T23:00:43.595 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:00:43.596 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T23:00:43.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:00:43.596 
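The `run_osd` helper traced above builds one long `ceph_args` string through repeated `+=` appends (ceph-helpers.sh lines 639-659 in the trace), then invokes `ceph-osd` twice: once with `--mkfs` to initialize the data directory, and once without it to start the daemon. A minimal sketch of that assembly pattern; the fsid here is a placeholder, not the one from this run:

```shell
#!/bin/bash
# Sketch of the ceph_args assembly done by run_osd in ceph-helpers.sh
# (mirrors trace lines 639-659 above). Placeholder fsid; $name is left
# unexpanded on purpose, as in the real helper, so ceph-osd expands it.
dir=td/test-erasure-eio
id=1
osd_data=$dir/$id

ceph_args="--fsid=00000000-0000-0000-0000-000000000000 --auth-supported=none"
ceph_args+=" --osd-failsafe-full-ratio=.99"
ceph_args+=" --osd-journal-size=100"
ceph_args+=" --osd-data=$osd_data"
ceph_args+=" --osd-journal=$osd_data/journal"
ceph_args+=" --run-dir=$dir"
ceph_args+=" --debug-osd=20 --debug-ms=1 --debug-monc=20"
ceph_args+=" --log-file=$dir/\$name.log"
ceph_args+=" --pid-file=$dir/\$name.pid"

echo "$ceph_args"
```

The single-quoted arguments visible in the traced `ceph-osd` invocations (`'--log-file=td/test-erasure-eio/$name.log'` etc.) are this same deferred-expansion trick: the literal `$name` reaches the daemon, which substitutes its own entity name.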
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:00:43.890 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:00:43.890 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T23:00:43.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:43.891 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:00:43.892 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:00:43.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:00:43.911 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:43.913+0000 7f96d4d21780 -1 WARNING: all dangerous and experimental features are 
enabled. 2026-03-08T23:00:43.920 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:43.922+0000 7f96d4d21780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:43.922 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:43.923+0000 7f96d4d21780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:44.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:00:44.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:44.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:00:44.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:44.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:44.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:44.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:44.133 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:00:44.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:44.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:44.362 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:45.008 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:45.010+0000 7f96d4d21780 -1 Falling back to public interface 2026-03-08T23:00:45.364 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:00:45.364 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:45.364 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:45.364 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:45.364 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:45.364 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:45.599 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:45.851 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:45.853+0000 7f96d4d21780 -1 osd.1 0 log_to_monitors true 2026-03-08T23:00:46.601 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:00:46.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:46.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:46.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T23:00:46.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:46.602 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:46.951 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:47.954 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T23:00:47.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:47.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:47.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:47.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:47.954 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:48.203 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/4038087079,v1:127.0.0.1:6811/4038087079] [v2:127.0.0.1:6812/4038087079,v1:127.0.0.1:6813/4038087079] exists,up 50a5898d-29f4-496c-b156-569b086abaf0 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:48.204 
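`wait_for_osd`, traced repeatedly above, is a simple bounded poll: up to 300 one-second iterations of `ceph osd dump | grep "osd.$id up"`, returning 0 on the first match and breaking out (ceph-helpers.sh lines 978-991 in the trace). A self-contained sketch of that loop; the cluster query is stubbed out, and the stub's "up from the third poll" behavior is purely illustrative:

```shell
#!/bin/bash
# Bounded-poll sketch modelled on wait_for_osd in ceph-helpers.sh
# (trace lines 978-991 above). osd_dump_stub stands in for `ceph osd dump`:
# it reports the OSD down for the first two polls, then up.
polls=0
osd_dump_stub() {
    # Writes its "dump" into $dump instead of stdout so the poll counter
    # survives (piping the stub into grep would fork a subshell and lose it).
    polls=$((polls + 1))
    if [ "$polls" -ge 3 ]; then
        dump="osd.1 up   in  weight 1"
    else
        dump="osd.1 down out weight 0"
    fi
}

wait_for_osd() {
    local state=$1 id=$2 status=1 i
    for ((i = 0; i < 300; i++)); do
        osd_dump_stub
        if grep -q "osd\.$id $state" <<<"$dump"; then
            status=0
            break
        fi
        # The real helper sleeps 1s between polls; omitted to keep the sketch fast.
    done
    return $status
}

wait_for_osd up 1 && echo "osd.1 reported up after $polls polls"
```

This matches the shape of the trace: the repeated `echo 0`, `echo 1`, `echo 2` lines are the loop counter, and each `sleep 1` line is a poll that failed the grep before the OSD finally showed up in `ceph osd dump`.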
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' 
--osd-failsafe-full-ratio=.99' 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 
2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:48.204 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:00:48.205 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:00:48.205 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T23:00:48.206 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:00:48.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1daccd25-e18e-41de-8f27-7eb25c005316 2026-03-08T23:00:48.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 1daccd25-e18e-41de-8f27-7eb25c005316' 2026-03-08T23:00:48.207 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 1daccd25-e18e-41de-8f27-7eb25c005316 2026-03-08T23:00:48.207 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:00:48.220 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCg/61pysU9DRAAm0GQurS0c0tOFvCccRuDZA== 2026-03-08T23:00:48.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCg/61pysU9DRAAm0GQurS0c0tOFvCccRuDZA=="}' 2026-03-08T23:00:48.220 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1daccd25-e18e-41de-8f27-7eb25c005316 -i td/test-erasure-eio/2/new.json 2026-03-08T23:00:48.491 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:00:48.504 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T23:00:48.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCg/61pysU9DRAAm0GQurS0c0tOFvCccRuDZA== --osd-uuid 1daccd25-e18e-41de-8f27-7eb25c005316 2026-03-08T23:00:48.525 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:48.527+0000 7f521c42c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:48.527 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:48.529+0000 7f521c42c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:48.528 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:48.530+0000 7f521c42c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:48.529 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:48.531+0000 7f521c42c780 -1 bdev(0x5570d18ebc00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:00:48.529 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:48.531+0000 7f521c42c780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T23:00:51.255 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T23:00:51.255 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:00:51.256 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T23:00:51.256 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:00:51.256 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:00:51.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:00:51.559 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T23:00:51.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:51.559 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:00:51.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:00:51.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:00:51.577 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:51.579+0000 7f668a61f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:51.584 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:51.586+0000 7f668a61f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:51.585 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:51.587+0000 7f668a61f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:52.007 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:52.683 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:52.685+0000 7f668a61f780 -1 Falling back to public interface 2026-03-08T23:00:53.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T23:00:53.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:53.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:53.008 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:00:53.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:53.009 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:53.248 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:54.017 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:54.019+0000 7f668a61f780 -1 osd.2 0 log_to_monitors true 2026-03-08T23:00:54.250 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:54.250 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:54.250 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:54.250 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:00:54.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:54.251 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:54.489 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:55.492 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T23:00:55.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:55.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:55.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:55.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:55.492 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:55.768 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 19 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/2095733706,v1:127.0.0.1:6819/2095733706] [v2:127.0.0.1:6820/2095733706,v1:127.0.0.1:6821/2095733706] exists,up 1daccd25-e18e-41de-8f27-7eb25c005316 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr 
$count - 1)) 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:00:55.769 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T23:00:55.769 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:00:55.770 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:55.770 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:55.770 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:55.770 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:55.770 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:00:55.770 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:00:55.771 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:00:55.771 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T23:00:55.772 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:00:55.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1aa25955-81d2-48e7-a69d-83c9bc9e59ad 2026-03-08T23:00:55.773 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 1aa25955-81d2-48e7-a69d-83c9bc9e59ad' 2026-03-08T23:00:55.773 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 1aa25955-81d2-48e7-a69d-83c9bc9e59ad 2026-03-08T23:00:55.773 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:00:55.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCn/61p+Ew+LxAARXTN7GFw/IsJVIuXUIB07A== 2026-03-08T23:00:55.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCn/61p+Ew+LxAARXTN7GFw/IsJVIuXUIB07A=="}' 2026-03-08T23:00:55.792 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1aa25955-81d2-48e7-a69d-83c9bc9e59ad -i td/test-erasure-eio/3/new.json 2026-03-08T23:00:56.117 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T23:00:56.128 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json 2026-03-08T23:00:56.129 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCn/61p+Ew+LxAARXTN7GFw/IsJVIuXUIB07A== --osd-uuid 1aa25955-81d2-48e7-a69d-83c9bc9e59ad 2026-03-08T23:00:56.150 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:56.152+0000 7f4ffb9d4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:56.152 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:56.154+0000 7f4ffb9d4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:56.153 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:56.155+0000 7f4ffb9d4780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:56.154 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:56.156+0000 7f4ffb9d4780 -1 bdev(0x56536fc35c00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted 2026-03-08T23:00:56.154 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:56.156+0000 7f4ffb9d4780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid 2026-03-08T23:00:58.578 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring 2026-03-08T23:00:58.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:00:58.579 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository 2026-03-08T23:00:58.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T23:00:58.579 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:00:58.938 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T23:00:58.939 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3 2026-03-08T23:00:58.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:58.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:00:58.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:00:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:00:58.963 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:58.963+0000 7eff1082f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:58.965 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:58.968+0000 7eff1082f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:58.968 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:58.969+0000 7eff1082f780 -1 WARNING: all dangerous and experimental features are enabled. 
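The long ceph-osd command lines above are assembled flag by flag by run_osd. A condensed sketch of that assembly is below; `build_osd_args` is a hypothetical stand-in (the real logic lives in qa/standalone/ceph-helpers.sh), it only builds the flag string from the values seen in this run and never execs ceph-osd, and the asok path is simplified.

```shell
#!/bin/sh
# Hypothetical condensation of the run_osd flag assembly seen in the
# trace; it builds the per-OSD argument string and nothing else.
build_osd_args() {
    dir=$1 id=$2
    args='--osd-failsafe-full-ratio=.99'
    args="$args --osd-journal-size=100"
    args="$args --osd-scrub-load-threshold=2000"
    args="$args --osd-data=$dir/$id"
    args="$args --osd-journal=$dir/$id/journal"
    args="$args --run-dir=$dir"
    # $cluster and $name are left unexpanded: each daemon substitutes
    # them itself, which is why the trace shows them single-quoted on
    # the final ceph-osd command line
    args="$args --admin-socket=/tmp/ceph-asok/\$cluster-\$name.asok"
    args="$args --debug-osd=20 --debug-ms=1 --debug-monc=20"
    args="$args --log-file=$dir/\$name.log --pid-file=$dir/\$name.pid"
    echo "$args"
}
```

With `build_osd_args td/test-erasure-eio 3` the data, journal, log, and pid paths come out keyed to OSD id 3, matching the per-OSD layout the trace creates under td/test-erasure-eio/3.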
2026-03-08T23:00:59.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T23:00:59.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:59.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:00:59.166 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:00:59.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:59.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:59.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:59.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:59.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:59.167 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:59.393 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:59.521 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:00:59.523+0000 7eff1082f780 -1 Falling back to public interface 2026-03-08T23:01:00.391 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:00.393+0000 7eff1082f780 -1 osd.3 0 log_to_monitors true 2026-03-08T23:01:00.394 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:00.394 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:00.395 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:01:00.395 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:00.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:00.396 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:01:00.656 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:01.657 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:01.657 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:01.657 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:01:01.657 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:01:01.658 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:01.658 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:01:02.035 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:03.036 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T23:01:03.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:03.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:03.036 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:01:03.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:03.037 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:01:03.281 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 26 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/1566009740,v1:127.0.0.1:6827/1566009740] [v2:127.0.0.1:6828/1566009740,v1:127.0.0.1:6829/1566009740] exists,up 1aa25955-81d2-48e7-a69d-83c9bc9e59ad 2026-03-08T23:01:03.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:03.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:03.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:03.282 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 
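The wait_for_osd loop that the trace steps through (counter echo, `ceph osd dump | grep "osd.N up"`, sleep, up to 300 tries) has roughly the shape sketched below. This is an assumed reconstruction, not the ceph-helpers.sh source: the probe is factored out as a hypothetical `osd_probe` function so the loop can be exercised without a cluster, and the sleep is shortened from the real 1 s.

```shell
#!/usr/bin/env bash
# Sketch of the wait_for_osd polling loop from the trace. In the trace
# the probe is literally "ceph osd dump"; osd_probe is a hypothetical
# injection point supplied by the caller.
wait_for_osd() {
    local state=$1 id=$2 status=1
    for ((i = 0; i < 300; i++)); do
        echo "$i"                          # attempt counter, as in the log
        if osd_probe | grep -q "osd.$id $state"; then
            status=0
            break
        fi
        sleep 0.01    # the real helper sleeps 1s between probes
    done
    return $status
}
```

Defining an `osd_probe` that starts reporting `osd.2 up` after a few calls reproduces the "echo 1 / echo 2 / ... break / return 0" pattern visible in the log.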
2026-03-08T23:01:03.282 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:01:03.282 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:01:03.282 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:01:03.282 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:03.282 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:03.283 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T23:01:03.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T23:01:03.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T23:01:03.345 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T23:01:03.350 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T23:01:03.352 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T23:00:39.018+0000 7f1079d4d780 0 load: jerasure load: lrc 2026-03-08T23:01:03.353 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:335: TEST_rados_get_with_subreadall_eio_shard_0: local poolname=pool-jerasure 2026-03-08T23:01:03.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:336: TEST_rados_get_with_subreadall_eio_shard_0: create_erasure_coded_pool pool-jerasure 2 1 2026-03-08T23:01:03.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T23:01:03.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T23:01:03.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=2 2026-03-08T23:01:03.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T23:01:03.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=1 2026-03-08T23:01:03.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T23:01:03.353 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=2 m=1 crush-failure-domain=osd 2026-03-08T23:01:03.639 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T23:01:03.639 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T23:01:04.041 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T23:01:04.052 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:01:05.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T23:01:05.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:01:05.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:01:05.054 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:01:05.054 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:01:05.054 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:01:05.054 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:01:05.055 
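The pool setup the test performs here is a two-step sequence: set an erasure-code profile, then create the pool against it (in this run `create_pool` tolerates the "pool 'pool-jerasure' already exists" error and just sleeps). The sketch below condenses that sequence as a dry run: `run` only echoes, so no cluster is touched; `create_ec_pool` is a hypothetical wrapper, and myprofile, pool-jerasure, k=2, m=1 are the values from this job.

```shell
#!/bin/sh
# Dry-run sketch of create_erasure_coded_pool from the trace: "run"
# prints the command instead of executing it.
run() { echo "+ $*"; }

create_ec_pool() {
    pool=$1 k=$2 m=$3
    run ceph osd erasure-code-profile set myprofile \
        plugin=jerasure k="$k" m="$m" crush-failure-domain=osd
    run ceph osd pool create "$pool" 1 1 erasure myprofile
}

create_ec_pool pool-jerasure 2 1
```

With k=2 m=1 the pool survives a single EIO'd or lost shard, which is exactly what the TEST_rados_get_with_subreadall_eio_shard_0 case above exercises.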
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:01:05.055 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:01:05.055 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:01:05.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:01:05.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:01:05.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:01:05.134 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:01:05.135 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:01:05.135 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:01:05.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:01:05.399 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T23:01:05.399 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T23:01:05.399 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T23:01:05.399 
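The delays array logged above ('0.1' '0.2' ... '15' '4.5') is an exponential-backoff schedule: delays double from the starting value, individual sleeps appear capped at 15 s, and the last entry is trimmed so the series sums to exactly the timeout (90 s here). The awk sketch below reconstructs that behavior from the observed output; it is not the get_timeout_delays source from ceph-helpers.sh.

```shell
#!/bin/sh
# Reconstruction (from the observed output, not the helper source) of
# get_timeout_delays TIMEOUT FIRST: delays double from FIRST, each
# sleep is capped at 15s, and the final delay is trimmed so the series
# sums to exactly TIMEOUT.
get_timeout_delays() {
    awk -v total="$1" -v d="$2" 'BEGIN {
        sum = 0; sep = ""
        while (1) {
            if (d > 15) d = 15                 # cap each sleep
            if (sum + d >= total) {            # trim the last one
                printf "%s%g\n", sep, total - sum
                break
            }
            printf "%s%g", sep, d
            sep = " "; sum += d; d *= 2
        }
    }'
}
```

`get_timeout_delays 90 .1` reproduces the exact array in the log: 0.1 through 12.8 doubling (25.5 s), four capped 15 s sleeps (85.5 s), then 4.5 s to land on 90.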
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:01:05.399 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:05.399 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:01:05.483 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803783 2026-03-08T23:01:05.483 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803783 2026-03-08T23:01:05.483 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783' 2026-03-08T23:01:05.483 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:05.484 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:01:05.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574854 2026-03-08T23:01:05.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574854 2026-03-08T23:01:05.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854' 2026-03-08T23:01:05.558 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:01:05.559 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:01:05.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378628
2026-03-08T23:01:05.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378628
2026-03-08T23:01:05.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854 2-81604378628'
2026-03-08T23:01:05.635 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:01:05.635 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T23:01:05.712 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=111669149699
2026-03-08T23:01:05.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 111669149699
2026-03-08T23:01:05.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854 2-81604378628 3-111669149699'
2026-03-08T23:01:05.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:01:05.713 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803783
2026-03-08T23:01:05.713 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:01:05.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:01:05.714 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803783
2026-03-08T23:01:05.714 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:01:05.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803783
2026-03-08T23:01:05.715 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803783'
2026-03-08T23:01:05.715 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803783
2026-03-08T23:01:05.715 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:01:05.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803781 -lt 25769803783
2026-03-08T23:01:05.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:01:06.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:01:06.939 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:01:07.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803783 -lt 25769803783
2026-03-08T23:01:07.180 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:01:07.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574854
2026-03-08T23:01:07.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:01:07.183 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:01:07.184 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574854
2026-03-08T23:01:07.184 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:01:07.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574854
2026-03-08T23:01:07.185 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574854'
2026-03-08T23:01:07.185 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574854
2026-03-08T23:01:07.186 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:01:07.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574854 -lt 55834574854
2026-03-08T23:01:07.415 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:01:07.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378628
2026-03-08T23:01:07.416 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:01:07.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:01:07.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378628
2026-03-08T23:01:07.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:01:07.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378628
2026-03-08T23:01:07.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378628'
2026-03-08T23:01:07.418 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378628
2026-03-08T23:01:07.418 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:01:07.651 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378629 -lt 81604378628
2026-03-08T23:01:07.651 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:01:07.651 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-111669149699
2026-03-08T23:01:07.652 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:01:07.652 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T23:01:07.653 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-111669149699
2026-03-08T23:01:07.653 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:01:07.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=111669149699
2026-03-08T23:01:07.654 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 111669149699'
2026-03-08T23:01:07.654 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 111669149699
2026-03-08T23:01:07.654 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T23:01:07.880 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149699 -lt 111669149699
2026-03-08T23:01:07.880 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:01:07.881 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:01:07.881 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:01:08.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T23:01:08.193 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:01:08.193 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:01:08.193 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:01:08.193 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:01:08.193 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:01:08.193 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:01:08.193 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:01:08.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T23:01:08.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:01:08.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:01:08.417 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:338: TEST_rados_get_with_subreadall_eio_shard_0: rados_put_get_data eio td/test-erasure-eio 0 recovery
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:147: rados_put_get_data: local inject=eio
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:148: rados_put_get_data: shift
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:149: rados_put_get_data: local dir=td/test-erasure-eio
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:150: rados_put_get_data: shift
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:151: rados_put_get_data: local shard_id=0
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:152: rados_put_get_data: shift
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:153: rados_put_get_data: local arg=recovery
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:157: rados_put_get_data: local poolname=pool-jerasure
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:158: rados_put_get_data: local objname=obj-eio-102080-0
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:159: rados_put_get_data: rados_put td/test-erasure-eio pool-jerasure obj-eio-102080-0
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=obj-eio-102080-0
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA
2026-03-08T23:01:08.713 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:01:08.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB
2026-03-08T23:01:08.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:01:08.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC
2026-03-08T23:01:08.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:01:08.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD
2026-03-08T23:01:08.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put obj-eio-102080-0 td/test-erasure-eio/ORIGINAL
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:160: rados_put_get_data: inject_eio ec data pool-jerasure obj-eio-102080-0 td/test-erasure-eio 0
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=obj-eio-102080-0
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=0
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure obj-eio-102080-0
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T23:01:08.746 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-eio-102080-0
2026-03-08T23:01:08.747 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-eio-102080-0
2026-03-08T23:01:08.747 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T23:01:08.990 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T23:01:08.991 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T23:01:08.991 INFO:tasks.workunit.client.0.vm04.stderr:2'
2026-03-08T23:01:08.991 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2
2026-03-08T23:01:08.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '1' '2')
2026-03-08T23:01:08.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds
2026-03-08T23:01:08.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=3
2026-03-08T23:01:08.991 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']'
2026-03-08T23:01:08.991 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/3/type
2026-03-08T23:01:08.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore
2026-03-08T23:01:08.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 3 bluestore_debug_inject_read_err true
2026-03-08T23:01:08.992 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd
2026-03-08T23:01:08.993 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=3
2026-03-08T23:01:08.993 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err
2026-03-08T23:01:08.993 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true
2026-03-08T23:01:08.993 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")'
2026-03-08T23:01:08.993 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.3
2026-03-08T23:01:08.993 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3
2026-03-08T23:01:08.993 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']'
2026-03-08T23:01:08.994 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:01:08.994 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:01:08.994 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T23:01:08.994 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok
2026-03-08T23:01:08.994 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.3.asok config set bluestore_debug_inject_read_err true
2026-03-08T23:01:09.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true
2026-03-08T23:01:09.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0
2026-03-08T23:01:09.061 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid
2026-03-08T23:01:09.062 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.3
2026-03-08T23:01:09.062 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.3
2026-03-08T23:01:09.062 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.3 ']'
2026-03-08T23:01:09.062 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:01:09.062 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:01:09.062 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T23:01:09.062 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.3.asok
2026-03-08T23:01:09.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS=
2026-03-08T23:01:09.063 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.3.asok injectdataerr pool-jerasure obj-eio-102080-0 0
2026-03-08T23:01:09.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:161: rados_put_get_data: rados_get td/test-erasure-eio pool-jerasure obj-eio-102080-0
2026-03-08T23:01:09.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio
2026-03-08T23:01:09.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure
2026-03-08T23:01:09.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-eio-102080-0
2026-03-08T23:01:09.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok
2026-03-08T23:01:09.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']'
2026-03-08T23:01:09.130 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get obj-eio-102080-0 td/test-erasure-eio/COPY
2026-03-08T23:01:09.164 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY
2026-03-08T23:01:09.165 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY
2026-03-08T23:01:09.166 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:163: rados_put_get_data: '[' recovery = recovery ']'
2026-03-08T23:01:09.167 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:170: rados_put_get_data: get_osds pool-jerasure obj-eio-102080-0
2026-03-08T23:01:09.167 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T23:01:09.167 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-eio-102080-0
2026-03-08T23:01:09.167 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T23:01:09.167 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-eio-102080-0
2026-03-08T23:01:09.408 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T23:01:09.408 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T23:01:09.408 INFO:tasks.workunit.client.0.vm04.stderr:2'
2026-03-08T23:01:09.408 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2
2026-03-08T23:01:09.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:170: rados_put_get_data: initial_osds=('3' '1' '2')
2026-03-08T23:01:09.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:170: rados_put_get_data: local -a initial_osds
2026-03-08T23:01:09.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:171: rados_put_get_data: local last_osd=2
2026-03-08T23:01:09.408 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:173: rados_put_get_data: kill_daemons td/test-erasure-eio TERM osd.2
2026-03-08T23:01:09.408 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:01:09.408 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:01:09.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:01:09.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:01:09.409 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:01:09.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:01:09.513 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:174: rados_put_get_data: ceph osd out 2
2026-03-08T23:01:09.773 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 is already out.
2026-03-08T23:01:09.783 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:175: rados_put_get_data: get_osds pool-jerasure obj-eio-102080-0
2026-03-08T23:01:09.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:175: rados_put_get_data: grep '\<2\>'
2026-03-08T23:01:09.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T23:01:09.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-eio-102080-0
2026-03-08T23:01:09.784 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-eio-102080-0
2026-03-08T23:01:09.785 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T23:01:10.027 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T23:01:10.027 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T23:01:10.027 INFO:tasks.workunit.client.0.vm04.stderr:0'
2026-03-08T23:01:10.027 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 0
2026-03-08T23:01:10.027 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:176: rados_put_get_data: ceph osd in 2
2026-03-08T23:01:10.440 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 is already in.
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:177: rados_put_get_data: activate_osd td/test-erasure-eio 2
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/2
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2'
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal'
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:01:10.452 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T23:01:10.453 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:01:10.453 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:01:10.453 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:01:10.453 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:01:10.453 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:01:10.453 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T23:01:10.453 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T23:01:10.453 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T23:01:10.453 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:01:10.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T23:01:10.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T23:01:10.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:01:10.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:01:10.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:01:10.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:01:10.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:01:10.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:01:10.454 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T23:01:10.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:01:10.455 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T23:01:10.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:01:10.455 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/2/whoami 2026-03-08T23:01:10.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:01:10.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:01:10.457 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:01:10.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:01:10.474 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:10.476+0000 7f80f4a1e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:10.476 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:10.478+0000 7f80f4a1e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:10.477 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:10.479+0000 7f80f4a1e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:10.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:01:10.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:01:10.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:01:10.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:01:10.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:01:10.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:10.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T23:01:10.937 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:01:10.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:10.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:01:11.233 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:11.308 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:11.309+0000 7f80f4a1e780 -1 Falling back to public interface 2026-03-08T23:01:12.183 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:12.184+0000 7f80f4a1e780 -1 osd.2 33 log_to_monitors true 2026-03-08T23:01:12.236 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:01:12.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:12.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:12.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:12.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:12.236 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:01:12.627 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:13.629 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:01:13.629 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:13.629 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:13.629 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:01:13.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:13.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:01:13.898 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:14.900 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T23:01:14.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:14.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:14.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:01:14.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:14.901 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:01:15.140 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 
2026-03-08T23:01:16.142 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T23:01:16.143 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:16.143 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:16.143 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:01:16.143 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:16.143 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:01:16.390 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 43 up_thru 19 down_at 34 last_clean_interval [19,33) [v2:127.0.0.1:6818/1600817453,v1:127.0.0.1:6819/1600817453] [v2:127.0.0.1:6820/1600817453,v1:127.0.0.1:6821/1600817453] exists,up 1daccd25-e18e-41de-8f27-7eb25c005316 2026-03-08T23:01:16.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:16.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:16.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:16.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:178: rados_put_get_data: wait_for_clean 2026-03-08T23:01:16.391 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:01:16.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:01:16.391 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:01:16.391 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:01:16.392 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:01:16.392 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:01:16.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:01:16.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:01:16.392 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:01:16.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:01:16.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:01:16.473 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:01:16.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:01:16.473 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:01:16.473 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:01:16.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:01:16.704 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T23:01:16.704 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T23:01:16.704 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T23:01:16.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:01:16.704 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:16.704 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:01:16.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803787 2026-03-08T23:01:16.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803787 2026-03-08T23:01:16.781 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787' 2026-03-08T23:01:16.781 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:16.781 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:01:16.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574858 2026-03-08T23:01:16.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574858 2026-03-08T23:01:16.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858' 2026-03-08T23:01:16.857 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:16.857 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:01:16.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=184683593731 2026-03-08T23:01:16.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 184683593731 2026-03-08T23:01:16.929 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-184683593731' 2026-03-08T23:01:16.929 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:16.929 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:01:17.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=111669149703 2026-03-08T23:01:17.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 111669149703 2026-03-08T23:01:17.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-184683593731 3-111669149703' 2026-03-08T23:01:17.008 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:17.009 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803787 2026-03-08T23:01:17.009 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:17.010 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:01:17.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803787 2026-03-08T23:01:17.011 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:17.012 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803787 2026-03-08T23:01:17.012 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803787' 2026-03-08T23:01:17.012 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803787 2026-03-08T23:01:17.012 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:01:17.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803785 -lt 25769803787 2026-03-08T23:01:17.238 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:01:18.239 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:01:18.239 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:01:18.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803785 -lt 25769803787 2026-03-08T23:01:18.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:01:19.467 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:01:19.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:01:19.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803788 -lt 25769803787 2026-03-08T23:01:19.699 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:19.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574858 2026-03-08T23:01:19.700 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:19.701 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:01:19.702 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574858 2026-03-08T23:01:19.702 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:19.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574858 2026-03-08T23:01:19.703 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574858' 2026-03-08T23:01:19.703 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574858 2026-03-08T23:01:19.703 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:01:19.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 55834574859 -lt 55834574858 2026-03-08T23:01:19.937 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:19.938 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-184683593731 2026-03-08T23:01:19.938 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:19.939 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:01:19.939 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-184683593731 2026-03-08T23:01:19.940 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:19.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=184683593731 2026-03-08T23:01:19.940 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 184683593731' 2026-03-08T23:01:19.941 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 184683593731 2026-03-08T23:01:19.941 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:01:20.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 184683593732 -lt 184683593731 2026-03-08T23:01:20.176 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:20.176 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-111669149703 2026-03-08T23:01:20.177 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:20.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:01:20.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-111669149703 2026-03-08T23:01:20.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:20.181 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=111669149703 2026-03-08T23:01:20.181 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 111669149703' 2026-03-08T23:01:20.181 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 111669149703 2026-03-08T23:01:20.181 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:01:20.443 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149703 -lt 111669149703 2026-03-08T23:01:20.444 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 
2026-03-08T23:01:20.444 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:01:20.444 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:01:20.818 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:01:20.819 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:01:20.819 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:01:20.819 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:01:20.819 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:01:20.819 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:01:20.819 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:01:20.819 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:01:21.044 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:01:21.044 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:01:21.045 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:01:21.045 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:01:21.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:01:21.349 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:01:21.349 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:01:21.349 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:340: TEST_rados_get_with_subreadall_eio_shard_0: delete_erasure_coded_pool pool-jerasure 2026-03-08T23:01:21.349 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T23:01:21.349 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure --yes-i-really-really-mean-it 2026-03-08T23:01:21.732 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T23:01:21.743 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile 2026-03-08T23:01:22.026 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T23:01:22.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T23:01:22.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T23:01:22.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:01:22.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T23:01:22.038 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:01:22.038 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:01:22.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:01:22.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:01:22.038 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:01:22.172 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: 
return 0 2026-03-08T23:01:22.172 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:01:22.173 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:01:22.173 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:01:22.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T23:01:22.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:01:22.175 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:01:22.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:01:22.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:01:22.176 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:01:22.176 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:01:22.177 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:01:22.178 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:01:22.178 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T23:01:22.201 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:01:22.202 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:22.202 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:22.202 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T23:01:22.203 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:32: run: for func in $funcs 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:33: run: setup td/test-erasure-eio 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/test-erasure-eio 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: 
setup: teardown td/test-erasure-eio 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:01:22.204 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:01:22.206 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:01:22.206 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:01:22.207 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:01:22.207 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:01:22.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T23:01:22.208 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:01:22.208 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:01:22.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:01:22.209 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:01:22.210 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:01:22.210 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:01:22.211 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:01:22.212 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:01:22.212 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T23:01:22.213 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:01:22.213 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:22.213 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:22.213 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T23:01:22.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:01:22.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:01:22.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/test-erasure-eio 2026-03-08T23:01:22.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:01:22.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:22.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:22.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.102080 2026-03-08T23:01:22.216 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 
2026-03-08T23:01:22.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T23:01:22.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:01:22.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/test-erasure-eio 1' TERM HUP INT 2026-03-08T23:01:22.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:34: run: run_mon td/test-erasure-eio a 2026-03-08T23:01:22.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/test-erasure-eio 2026-03-08T23:01:22.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:01:22.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:01:22.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:01:22.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/test-erasure-eio/a 2026-03-08T23:01:22.217 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/test-erasure-eio/a --run-dir=td/test-erasure-eio 2026-03-08T23:01:22.258 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:01:22.258 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:22.258 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:22.258 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:22.259 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:22.259 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:22.259 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:01:22.259 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/test-erasure-eio/a '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --mon-cluster-log-file=td/test-erasure-eio/log --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T23:01:22.308 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 
2026-03-08T23:01:22.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:01:22.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:01:22.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:01:22.309 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:01:22.310 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:01:22.310 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:01:22.310 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:01:22.310 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:01:22.310 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:01:22.310 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:22.310 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:22.311 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T23:01:22.311 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:01:22.311 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get fsid 2026-03-08T23:01:22.373 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:01:22.373 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:01:22.373 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:01:22.373 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:01:22.373 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:01:22.373 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:01:22.373 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:01:22.373 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:01:22.373 
INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:01:22.373 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:22.374 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:22.374 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T23:01:22.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:01:22.374 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.102080/ceph-mon.a.asok config get mon_host 2026-03-08T23:01:22.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:35: run: run_mgr td/test-erasure-eio x 2026-03-08T23:01:22.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/test-erasure-eio 2026-03-08T23:01:22.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:01:22.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:01:22.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:01:22.432 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/test-erasure-eio/x 2026-03-08T23:01:22.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:01:22.564 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:01:22.564 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:22.564 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:22.564 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:22.564 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:22.564 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:22.564 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:01:22.565 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:01:22.566 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 
20 --debug-paxos 20 --chdir= --mgr-data=td/test-erasure-eio/x '--log-file=td/test-erasure-eio/$name.log' '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --run-dir=td/test-erasure-eio '--pid-file=td/test-erasure-eio/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:01:22.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:36: run: create_pool rbd 4 2026-03-08T23:01:22.590 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:01:22.715 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd' already exists 2026-03-08T23:01:22.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:01:23.727 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: get_asok_path mon.a 2026-03-08T23:01:23.727 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:01:23.727 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:01:23.727 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:01:23.727 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:23.727 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:23.727 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-mon.a.asok 2026-03-08T23:01:23.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: CEPH_ARGS= 2026-03-08T23:01:23.727 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:39: run: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-mon.a.asok log flush 2026-03-08T23:01:23.779 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T23:01:23.784 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:40: run: grep 'load: jerasure.*lrc' td/test-erasure-eio/mon.a.log 2026-03-08T23:01:23.785 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T23:01:22.283+0000 7fca83c1dd80 0 load: jerasure load: lrc 2026-03-08T23:01:23.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:41: run: TEST_rados_get_with_subreadall_eio_shard_1 td/test-erasure-eio 2026-03-08T23:01:23.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:344: TEST_rados_get_with_subreadall_eio_shard_1: local dir=td/test-erasure-eio 2026-03-08T23:01:23.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:345: TEST_rados_get_with_subreadall_eio_shard_1: local shard_id=1 2026-03-08T23:01:23.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:347: TEST_rados_get_with_subreadall_eio_shard_1: setup_osds 4 2026-03-08T23:01:23.786 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:47: setup_osds: local count=4 2026-03-08T23:01:23.786 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:48: setup_osds: shift 2026-03-08T23:01:23.786 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: expr 4 - 1 2026-03-08T23:01:23.787 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: seq 0 3 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 0 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/0 2026-03-08T23:01:23.788 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/0' 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/0/journal' 2026-03-08T23:01:23.788 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:01:23.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:01:23.789 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T23:01:23.789 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:01:23.789 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:23.789 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:23.789 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:23.789 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:23.789 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:23.789 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T23:01:23.790 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:01:23.790 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/0 2026-03-08T23:01:23.791 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:01:23.791 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=fb0461ad-006d-4488-9066-c5923638db13 2026-03-08T23:01:23.792 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 
fb0461ad-006d-4488-9066-c5923638db13' 2026-03-08T23:01:23.792 INFO:tasks.workunit.client.0.vm04.stdout:add osd0 fb0461ad-006d-4488-9066-c5923638db13 2026-03-08T23:01:23.792 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:01:23.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDD/61phqYRMBAAkamSZb+LdMQFQeg3oVQsMA== 2026-03-08T23:01:23.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDD/61phqYRMBAAkamSZb+LdMQFQeg3oVQsMA=="}' 2026-03-08T23:01:23.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new fb0461ad-006d-4488-9066-c5923638db13 -i td/test-erasure-eio/0/new.json 2026-03-08T23:01:23.955 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:01:23.964 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/0/new.json 2026-03-08T23:01:23.965 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDD/61phqYRMBAAkamSZb+LdMQFQeg3oVQsMA== --osd-uuid fb0461ad-006d-4488-9066-c5923638db13 2026-03-08T23:01:23.990 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:23.990+0000 7f680e110780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:23.996 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:23.998+0000 7f680e110780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:23.997 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:23.999+0000 7f680e110780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:23.997 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:23.999+0000 7f680e110780 -1 bdev(0x563c222e4800 td/test-erasure-eio/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:01:23.997 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:24.000+0000 7f680e110780 -1 bluestore(td/test-erasure-eio/0) _read_fsid unparsable uuid 2026-03-08T23:01:26.625 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/0/keyring 2026-03-08T23:01:26.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:01:26.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:01:26.626 INFO:tasks.workunit.client.0.vm04.stdout:adding osd0 key to auth repository 2026-03-08T23:01:26.626 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 
2026-03-08T23:01:26.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:01:26.922 INFO:tasks.workunit.client.0.vm04.stdout:start osd.0 2026-03-08T23:01:26.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/0 --osd-journal=td/test-erasure-eio/0/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:01:26.922 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:01:26.923 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:01:26.925 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:01:26.941 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:26.943+0000 7f5697212780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:26.977 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:26.979+0000 7f5697212780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:26.978 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:26.980+0000 7f5697212780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:27.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:01:27.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:01:27.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:01:27.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:01:27.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:01:27.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:27.153 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:01:27.154 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:01:27.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:27.154 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:01:27.372 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:28.374 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:28.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:28.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:28.374 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:01:28.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:28.374 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:01:28.555 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:28.557+0000 7f5697212780 -1 Falling back to public interface 2026-03-08T23:01:28.596 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:29.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:29.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:29.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:01:29.598 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:01:29.598 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:29.598 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:01:29.852 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:29.908 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:29.910+0000 7f5697212780 -1 osd.0 0 log_to_monitors true 2026-03-08T23:01:30.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:30.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:30.854 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:01:30.854 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T23:01:30.855 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:30.856 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:01:31.110 INFO:tasks.workunit.client.0.vm04.stdout:osd.0 up in weight 1 up_from 6 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/356764040,v1:127.0.0.1:6803/356764040] [v2:127.0.0.1:6804/356764040,v1:127.0.0.1:6805/356764040] exists,up fb0461ad-006d-4488-9066-c5923638db13 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:31.111 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 1 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/1 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: 
ceph_args+=' --osd-journal-size=100' 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:01:31.111 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/1' 2026-03-08T23:01:31.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/1/journal' 2026-03-08T23:01:31.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:01:31.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:01:31.112 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T23:01:31.112 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:01:31.112 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:31.112 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:31.112 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:31.112 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:31.113 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:01:31.113 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/1 2026-03-08T23:01:31.113 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:01:31.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=2e24a379-c3ec-46ba-b4fc-39dc4aa110ad 2026-03-08T23:01:31.114 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 2e24a379-c3ec-46ba-b4fc-39dc4aa110ad' 2026-03-08T23:01:31.114 INFO:tasks.workunit.client.0.vm04.stdout:add osd1 2e24a379-c3ec-46ba-b4fc-39dc4aa110ad 2026-03-08T23:01:31.115 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:01:31.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDL/61p65blBxAAvUyFyU/wwyN+vBkeLWnNIQ== 2026-03-08T23:01:31.133 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDL/61p65blBxAAvUyFyU/wwyN+vBkeLWnNIQ=="}' 2026-03-08T23:01:31.133 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 2e24a379-c3ec-46ba-b4fc-39dc4aa110ad -i td/test-erasure-eio/1/new.json 2026-03-08T23:01:31.364 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:01:31.376 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/1/new.json 2026-03-08T23:01:31.377 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDL/61p65blBxAAvUyFyU/wwyN+vBkeLWnNIQ== --osd-uuid 2e24a379-c3ec-46ba-b4fc-39dc4aa110ad 2026-03-08T23:01:31.397 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:31.399+0000 7f6468e3f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:31.400 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:31.402+0000 7f6468e3f780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:31.401 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:31.403+0000 7f6468e3f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:31.402 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:31.404+0000 7f6468e3f780 -1 bdev(0x56075e2cfc00 td/test-erasure-eio/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:01:31.402 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:31.404+0000 7f6468e3f780 -1 bluestore(td/test-erasure-eio/1) _read_fsid unparsable uuid 2026-03-08T23:01:34.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/1/keyring 2026-03-08T23:01:34.527 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:01:34.528 INFO:tasks.workunit.client.0.vm04.stdout:adding osd1 key to auth repository 2026-03-08T23:01:34.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:01:34.528 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:01:34.835 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:01:34.835 INFO:tasks.workunit.client.0.vm04.stdout:start osd.1 2026-03-08T23:01:34.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/1 --osd-journal=td/test-erasure-eio/1/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:01:34.836 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:01:34.837 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:01:34.838 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:01:34.856 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:34.857+0000 7f35d578a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:34.862 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:34.864+0000 7f35d578a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:34.863 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:34.865+0000 7f35d578a780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:35.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:01:35.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:01:35.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:01:35.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:01:35.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:01:35.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:35.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:01:35.055 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:01:35.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:35.055 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:01:35.284 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:35.431 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:35.432+0000 7f35d578a780 -1 Falling back to public interface 2026-03-08T23:01:36.285 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:01:36.286 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:36.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:36.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:36.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:36.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:01:36.310 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:36.312+0000 7f35d578a780 -1 osd.1 0 log_to_monitors true 2026-03-08T23:01:36.552 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:37.556 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:01:37.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:37.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:37.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:01:37.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:01:37.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:37.802 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:38.057 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:38.059+0000 7f35d0f29640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T23:01:38.804 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T23:01:38.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:38.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:38.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:01:38.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:38.805 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:01:39.028 INFO:tasks.workunit.client.0.vm04.stdout:osd.1 up in weight 1 up_from 13 up_thru 13 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1065490223,v1:127.0.0.1:6811/1065490223] [v2:127.0.0.1:6812/1065490223,v1:127.0.0.1:6813/1065490223] exists,up 2e24a379-c3ec-46ba-b4fc-39dc4aa110ad 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:39.029 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 2 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/2 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:01:39.029 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2' 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal' 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:39.029 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:39.030 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:39.030 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:39.030 
INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:39.030 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:01:39.031 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:01:39.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/2 2026-03-08T23:01:39.032 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:01:39.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=7577b3a8-ca1c-4d06-a3f8-651a92dea82d 2026-03-08T23:01:39.033 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 7577b3a8-ca1c-4d06-a3f8-651a92dea82d' 2026-03-08T23:01:39.033 INFO:tasks.workunit.client.0.vm04.stdout:add osd2 7577b3a8-ca1c-4d06-a3f8-651a92dea82d 2026-03-08T23:01:39.033 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:01:39.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDT/61pE/0EAxAAdDrX0K8MYPpvPlKgixPBvg== 2026-03-08T23:01:39.049 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDT/61pE/0EAxAAdDrX0K8MYPpvPlKgixPBvg=="}' 2026-03-08T23:01:39.049 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 7577b3a8-ca1c-4d06-a3f8-651a92dea82d -i td/test-erasure-eio/2/new.json 2026-03-08T23:01:39.357 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:01:39.366 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/2/new.json 2026-03-08T23:01:39.367 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDT/61pE/0EAxAAdDrX0K8MYPpvPlKgixPBvg== --osd-uuid 7577b3a8-ca1c-4d06-a3f8-651a92dea82d 2026-03-08T23:01:39.386 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:39.388+0000 7fd8e3c2e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:39.389 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:39.391+0000 7fd8e3c2e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:39.390 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:39.392+0000 7fd8e3c2e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:39.390 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:39.392+0000 7fd8e3c2e780 -1 bdev(0x5637c990bc00 td/test-erasure-eio/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:01:39.390 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:39.393+0000 7fd8e3c2e780 -1 bluestore(td/test-erasure-eio/2) _read_fsid unparsable uuid 2026-03-08T23:01:41.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/2/keyring 2026-03-08T23:01:41.536 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:01:41.537 INFO:tasks.workunit.client.0.vm04.stdout:adding osd2 key to auth repository 2026-03-08T23:01:41.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:01:41.537 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:01:41.865 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2 2026-03-08T23:01:41.866 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:01:41.866 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:01:41.866 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:01:41.867 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:01:41.869 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:01:41.887 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:41.887+0000 7f2b6a62e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:41.892 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:41.894+0000 7f2b6a62e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:41.894 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:41.895+0000 7f2b6a62e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:42.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:01:42.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:01:42.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:01:42.104 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:01:42.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:01:42.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:01:42.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:42.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:01:42.104 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:42.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:01:42.327 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:43.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:43.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i < 300 )) 2026-03-08T23:01:43.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:43.328 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:01:43.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:43.328 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:01:43.481 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:43.481+0000 7f2b6a62e780 -1 Falling back to public interface 2026-03-08T23:01:43.566 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:44.568 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:01:44.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:44.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:44.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:01:44.571 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:44.572 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:01:44.772 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:44.774+0000 7f2b6a62e780 -1 osd.2 0 log_to_monitors true 2026-03-08T23:01:44.800 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:45.803 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T23:01:45.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:45.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:45.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:01:45.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:45.803 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:01:46.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:46.975 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:46.977+0000 7f2b65b7c640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T23:01:47.046 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:47.047 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-08T23:01:47.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:47.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:01:47.047 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:47.047 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:01:47.293 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 19 up_thru 19 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1652042794,v1:127.0.0.1:6819/1652042794] [v2:127.0.0.1:6820/1652042794,v1:127.0.0.1:6821/1652042794] exists,up 7577b3a8-ca1c-4d06-a3f8-651a92dea82d 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:50: setup_osds: for id in $(seq 0 $(expr $count - 1)) 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:51: setup_osds: run_osd td/test-erasure-eio 3 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/test-erasure-eio 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 
2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/test-erasure-eio/3 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true ' 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/test-erasure-eio/3' 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/3/journal' 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:01:47.294 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/test-erasure-eio' 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:47.294 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:47.295 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:47.295 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:47.295 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:47.295 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:01:47.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' 2026-03-08T23:01:47.295 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:01:47.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:01:47.296 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:01:47.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log' 2026-03-08T23:01:47.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid' 2026-03-08T23:01:47.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:01:47.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:01:47.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:01:47.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:01:47.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:01:47.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:01:47.296 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/test-erasure-eio/3 2026-03-08T23:01:47.298 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: 
run_osd: uuidgen 2026-03-08T23:01:47.299 INFO:tasks.workunit.client.0.vm04.stdout:add osd3 9876f57d-2603-46a6-86b9-1785c9598aaa 2026-03-08T23:01:47.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=9876f57d-2603-46a6-86b9-1785c9598aaa 2026-03-08T23:01:47.299 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 9876f57d-2603-46a6-86b9-1785c9598aaa' 2026-03-08T23:01:47.299 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:01:47.312 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDb/61p/d24EhAA8VXkhjxQY2swOK/MBwnbCw== 2026-03-08T23:01:47.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDb/61p/d24EhAA8VXkhjxQY2swOK/MBwnbCw=="}' 2026-03-08T23:01:47.313 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 9876f57d-2603-46a6-86b9-1785c9598aaa -i td/test-erasure-eio/3/new.json 2026-03-08T23:01:47.538 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-08T23:01:47.549 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/test-erasure-eio/3/new.json 2026-03-08T23:01:47.550 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDb/61p/d24EhAA8VXkhjxQY2swOK/MBwnbCw== --osd-uuid 9876f57d-2603-46a6-86b9-1785c9598aaa 2026-03-08T23:01:47.570 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:47.572+0000 7f1b24190780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:47.572 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:47.574+0000 7f1b24190780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:47.573 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:47.576+0000 7f1b24190780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:47.574 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:47.576+0000 7f1b24190780 -1 bdev(0x564c65645c00 td/test-erasure-eio/3/block) open stat got: (1) Operation not permitted 2026-03-08T23:01:47.574 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:47.576+0000 7f1b24190780 -1 bluestore(td/test-erasure-eio/3) _read_fsid unparsable uuid 2026-03-08T23:01:49.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/test-erasure-eio/3/keyring 2026-03-08T23:01:49.978 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:01:49.979 INFO:tasks.workunit.client.0.vm04.stdout:adding osd3 key to auth repository 2026-03-08T23:01:49.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T23:01:49.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/test-erasure-eio/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:01:50.284 INFO:tasks.workunit.client.0.vm04.stdout:start osd.3 2026-03-08T23:01:50.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T23:01:50.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/3 --osd-journal=td/test-erasure-eio/3/journal --chdir= --run-dir=td/test-erasure-eio 
'--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:01:50.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:01:50.286 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:01:50.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:01:50.303 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:50.305+0000 7f10d1c0c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:50.309 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:50.311+0000 7f10d1c0c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:50.311 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:50.312+0000 7f10d1c0c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:50.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T23:01:50.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:01:50.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:01:50.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:01:50.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:01:50.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:50.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:01:50.516 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-08T23:01:50.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:50.516 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:01:50.738 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:50.866 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:50.868+0000 7f10d1c0c780 -1 Falling back to public interface 2026-03-08T23:01:51.732 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:51.734+0000 7f10d1c0c780 -1 osd.3 0 log_to_monitors true 2026-03-08T23:01:51.739 
INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-08T23:01:51.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:51.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:51.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:51.739 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:51.740 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:01:51.986 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:52.637 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:01:52.639+0000 7f10cccd3640 -1 osd.3 0 waiting for initial osdmap 2026-03-08T23:01:52.988 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-08T23:01:52.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:52.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:52.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:01:52.988 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:01:52.989 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:53.352 INFO:tasks.workunit.client.0.vm04.stdout:osd.3 up in weight 1 up_from 27 up_thru 28 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3895592958,v1:127.0.0.1:6827/3895592958] [v2:127.0.0.1:6828/3895592958,v1:127.0.0.1:6829/3895592958] exists,up 9876f57d-2603-46a6-86b9-1785c9598aaa 2026-03-08T23:01:53.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:53.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:53.352 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:53.352 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: get_asok_path osd.0 2026-03-08T23:01:53.353 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:01:53.353 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:01:53.353 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:01:53.353 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:53.353 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:01:53.353 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.0.asok 2026-03-08T23:01:53.354 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: CEPH_ARGS= 2026-03-08T23:01:53.354 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:55: setup_osds: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.0.asok log flush 2026-03-08T23:01:53.410 INFO:tasks.workunit.client.0.vm04.stdout:{} 2026-03-08T23:01:53.416 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:56: setup_osds: grep 'load: jerasure.*lrc' td/test-erasure-eio/osd.0.log 2026-03-08T23:01:53.418 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-08T23:01:28.562+0000 7f5697212780 0 load: jerasure load: lrc 2026-03-08T23:01:53.418 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:349: TEST_rados_get_with_subreadall_eio_shard_1: local poolname=pool-jerasure 2026-03-08T23:01:53.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:350: TEST_rados_get_with_subreadall_eio_shard_1: create_erasure_coded_pool pool-jerasure 2 1 2026-03-08T23:01:53.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:67: create_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T23:01:53.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:68: create_erasure_coded_pool: shift 2026-03-08T23:01:53.419 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:69: create_erasure_coded_pool: local k=2 2026-03-08T23:01:53.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:70: create_erasure_coded_pool: shift 2026-03-08T23:01:53.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:71: create_erasure_coded_pool: local m=1 2026-03-08T23:01:53.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:72: create_erasure_coded_pool: shift 2026-03-08T23:01:53.419 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:74: create_erasure_coded_pool: ceph osd erasure-code-profile set myprofile plugin=jerasure k=2 m=1 crush-failure-domain=osd 2026-03-08T23:01:53.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:78: create_erasure_coded_pool: create_pool pool-jerasure 1 1 erasure myprofile 2026-03-08T23:01:53.719 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create pool-jerasure 1 1 erasure myprofile 2026-03-08T23:01:54.093 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' already exists 2026-03-08T23:01:54.105 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:01:55.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:80: create_erasure_coded_pool: wait_for_clean 2026-03-08T23:01:55.106 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:01:55.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:01:55.106 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:01:55.106 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:01:55.107 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:01:55.107 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:01:55.107 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:01:55.107 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:01:55.107 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:01:55.214 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:01:55.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:01:55.215 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:01:55.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:01:55.215 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:01:55.215 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:01:55.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:01:55.455 INFO:tasks.workunit.client.0.vm04.stderr:1 2026-03-08T23:01:55.455 INFO:tasks.workunit.client.0.vm04.stderr:2 2026-03-08T23:01:55.455 INFO:tasks.workunit.client.0.vm04.stderr:3' 2026-03-08T23:01:55.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:01:55.455 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:55.455 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:01:55.544 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803783 2026-03-08T23:01:55.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803783 2026-03-08T23:01:55.545 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783' 2026-03-08T23:01:55.545 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:55.545 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:01:55.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574854 2026-03-08T23:01:55.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574854 2026-03-08T23:01:55.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854' 2026-03-08T23:01:55.630 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:55.630 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:01:55.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378628 2026-03-08T23:01:55.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378628 2026-03-08T23:01:55.714 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854 2-81604378628' 2026-03-08T23:01:55.714 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:55.714 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:01:55.794 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116995 2026-03-08T23:01:55.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116995 2026-03-08T23:01:55.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803783 1-55834574854 2-81604378628 3-115964116995' 2026-03-08T23:01:55.795 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:55.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803783 2026-03-08T23:01:55.795 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:55.796 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:01:55.796 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803783 2026-03-08T23:01:55.796 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:55.797 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803783 2026-03-08T23:01:55.797 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803783' 2026-03-08T23:01:55.797 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803783 2026-03-08T23:01:55.797 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:01:56.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803781 -lt 25769803783 2026-03-08T23:01:56.031 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:01:57.032 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:01:57.033 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:01:57.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803784 -lt 25769803783 2026-03-08T23:01:57.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:57.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574854 2026-03-08T23:01:57.284 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T23:01:57.285 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:01:57.286 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574854 2026-03-08T23:01:57.286 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:57.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574854 2026-03-08T23:01:57.287 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574854' 2026-03-08T23:01:57.287 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574854 2026-03-08T23:01:57.288 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:01:57.521 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574854 -lt 55834574854 2026-03-08T23:01:57.521 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:57.521 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-81604378628 2026-03-08T23:01:57.521 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:57.522 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T23:01:57.522 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-81604378628 2026-03-08T23:01:57.522 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:57.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378628 2026-03-08T23:01:57.523 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 81604378628' 2026-03-08T23:01:57.523 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 81604378628 2026-03-08T23:01:57.524 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:01:57.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378628 -lt 81604378628 2026-03-08T23:01:57.747 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:57.747 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116995 2026-03-08T23:01:57.747 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:57.748 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:01:57.749 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: echo 3-115964116995 2026-03-08T23:01:57.749 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:57.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116995 2026-03-08T23:01:57.750 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116995' 2026-03-08T23:01:57.750 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116995 2026-03-08T23:01:57.750 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:01:57.979 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116995 -lt 115964116995 2026-03-08T23:01:57.980 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:01:57.980 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:01:57.980 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:01:58.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:01:58.315 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:01:58.316 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:01:58.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:01:58.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:01:58.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:01:58.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:01:58.316 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:01:58.549 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:01:58.550 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:01:58.550 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:01:58.550 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:01:58.894 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:01:58.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:01:58.894 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:352: TEST_rados_get_with_subreadall_eio_shard_1: rados_put_get_data eio td/test-erasure-eio 1 recovery 2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:147: rados_put_get_data: local inject=eio 2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:148: rados_put_get_data: shift 2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:149: rados_put_get_data: local dir=td/test-erasure-eio 2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:150: rados_put_get_data: shift 2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:151: rados_put_get_data: local shard_id=1 2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:152: rados_put_get_data: shift 2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:153: 
rados_put_get_data: local arg=recovery
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:157: rados_put_get_data: local poolname=pool-jerasure
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:158: rados_put_get_data: local objname=obj-eio-102080-1
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:159: rados_put_get_data: rados_put td/test-erasure-eio pool-jerasure obj-eio-102080-1
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:90: rados_put: local dir=td/test-erasure-eio
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:91: rados_put: local poolname=pool-jerasure
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:92: rados_put: local objname=obj-eio-102080-1
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 AAA
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 BBB
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 CCCC
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:94: rados_put: for marker in AAA BBB CCCC DDDD
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:95: rados_put: printf '%*s' 1024 DDDD
2026-03-08T23:01:58.895 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:100: rados_put: rados --pool pool-jerasure put obj-eio-102080-1 td/test-erasure-eio/ORIGINAL
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:160: rados_put_get_data: inject_eio ec data pool-jerasure obj-eio-102080-1 td/test-erasure-eio 1
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=ec
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=pool-jerasure
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=obj-eio-102080-1
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/test-erasure-eio
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift
2026-03-08T23:01:58.941 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=1
2026-03-08T23:01:58.942 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift
2026-03-08T23:01:58.942 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds pool-jerasure obj-eio-102080-1
2026-03-08T23:01:58.942 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T23:01:58.942 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-eio-102080-1
2026-03-08T23:01:58.942 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-eio-102080-1
2026-03-08T23:01:58.942 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T23:01:59.196 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T23:01:59.196 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T23:01:59.196 INFO:tasks.workunit.client.0.vm04.stderr:2'
2026-03-08T23:01:59.196 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2
2026-03-08T23:01:59.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('3' '1' '2')
2026-03-08T23:01:59.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds
2026-03-08T23:01:59.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=1
2026-03-08T23:01:59.196 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' ec '!=' ec ']'
2026-03-08T23:01:59.196 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/test-erasure-eio/1/type
2026-03-08T23:01:59.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore
2026-03-08T23:01:59.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 1 bluestore_debug_inject_read_err true
2026-03-08T23:01:59.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd
2026-03-08T23:01:59.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=1
2026-03-08T23:01:59.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err
2026-03-08T23:01:59.197 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true
2026-03-08T23:01:59.198 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")'
2026-03-08T23:01:59.198 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.1
2026-03-08T23:01:59.198 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1
2026-03-08T23:01:59.198 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']'
2026-03-08T23:01:59.199 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:01:59.199 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:01:59.199 INFO:tasks.workunit.client.0.vm04.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T23:01:59.199 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.1.asok
2026-03-08T23:01:59.199 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.102080/ceph-osd.1.asok config set bluestore_debug_inject_read_err true
2026-03-08T23:01:59.255 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true
2026-03-08T23:01:59.255 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0
2026-03-08T23:01:59.256 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid
2026-03-08T23:01:59.256 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.1
2026-03-08T23:01:59.256 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1
2026-03-08T23:01:59.256 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']'
2026-03-08T23:01:59.257 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:01:59.257 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:01:59.257 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T23:01:59.257 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.102080/ceph-osd.1.asok
2026-03-08T23:01:59.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS=
2026-03-08T23:01:59.257 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.102080/ceph-osd.1.asok injectdataerr pool-jerasure obj-eio-102080-1 1
2026-03-08T23:01:59.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:161: rados_put_get_data: rados_get td/test-erasure-eio pool-jerasure obj-eio-102080-1
2026-03-08T23:01:59.321 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:104: rados_get: local dir=td/test-erasure-eio
2026-03-08T23:01:59.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:105: rados_get: local poolname=pool-jerasure
2026-03-08T23:01:59.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:106: rados_get: local objname=obj-eio-102080-1
2026-03-08T23:01:59.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:107: rados_get: local expect=ok
2026-03-08T23:01:59.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:112: rados_get: '[' ok = fail ']'
2026-03-08T23:01:59.322 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:120: rados_get: rados --pool pool-jerasure get obj-eio-102080-1 td/test-erasure-eio/COPY
2026-03-08T23:01:59.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:121: rados_get: diff td/test-erasure-eio/ORIGINAL td/test-erasure-eio/COPY
2026-03-08T23:01:59.348 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:122: rados_get: rm td/test-erasure-eio/COPY
2026-03-08T23:01:59.349 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:163: rados_put_get_data: '[' recovery = recovery ']'
2026-03-08T23:01:59.349 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:170: rados_put_get_data: get_osds pool-jerasure obj-eio-102080-1
2026-03-08T23:01:59.349 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T23:01:59.349 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-eio-102080-1
2026-03-08T23:01:59.350 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-eio-102080-1
2026-03-08T23:01:59.350 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T23:01:59.616 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T23:01:59.616 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T23:01:59.616 INFO:tasks.workunit.client.0.vm04.stderr:2'
2026-03-08T23:01:59.616 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2
2026-03-08T23:01:59.617 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:170: rados_put_get_data: initial_osds=('3' '1' '2')
2026-03-08T23:01:59.617 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:170: rados_put_get_data: local -a initial_osds
2026-03-08T23:01:59.617 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:171: rados_put_get_data: local last_osd=2
2026-03-08T23:01:59.617 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:173: rados_put_get_data: kill_daemons td/test-erasure-eio TERM osd.2
2026-03-08T23:01:59.617 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:01:59.617 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:01:59.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:01:59.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:01:59.618 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:01:59.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:01:59.725 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:174: rados_put_get_data: ceph osd out 2
2026-03-08T23:02:00.016 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 is already out.
2026-03-08T23:02:00.027 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:175: rados_put_get_data: grep '\<2\>'
2026-03-08T23:02:00.027 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:175: rados_put_get_data: get_osds pool-jerasure obj-eio-102080-1
2026-03-08T23:02:00.027 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=pool-jerasure
2026-03-08T23:02:00.027 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=obj-eio-102080-1
2026-03-08T23:02:00.027 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map pool-jerasure obj-eio-102080-1
2026-03-08T23:02:00.028 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T23:02:00.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T23:02:00.282 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T23:02:00.282 INFO:tasks.workunit.client.0.vm04.stderr:0'
2026-03-08T23:02:00.282 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 0
2026-03-08T23:02:00.283 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:176: rados_put_get_data: ceph osd in 2
2026-03-08T23:02:00.541 INFO:tasks.workunit.client.0.vm04.stderr:osd.2 is already in.
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:177: rados_put_get_data: activate_osd td/test-erasure-eio 2
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/test-erasure-eio
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/test-erasure-eio/2
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true '
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/test-erasure-eio/2'
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/test-erasure-eio/2/journal'
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:02:00.554 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/test-erasure-eio'
2026-03-08T23:02:00.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:02:00.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:02:00.555 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:02:00.555 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:02:00.555 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:02:00.556 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080
2026-03-08T23:02:00.556 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T23:02:00.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok'
2026-03-08T23:02:00.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:02:00.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/test-erasure-eio/$name.log'
2026-03-08T23:02:00.556 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/test-erasure-eio/$name.pid'
2026-03-08T23:02:00.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:02:00.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:02:00.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:02:00.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:02:00.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:02:00.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:02:00.557 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/test-erasure-eio/2
2026-03-08T23:02:00.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2
2026-03-08T23:02:00.558 INFO:tasks.workunit.client.0.vm04.stdout:start osd.2
2026-03-08T23:02:00.558 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=fc7ae229-234e-46c3-818a-2e4e2ac45700 --auth-supported=none --mon-host=127.0.0.1:7112 --osd_mclock_override_recovery_settings=true --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/test-erasure-eio/2 --osd-journal=td/test-erasure-eio/2/journal --chdir= --run-dir=td/test-erasure-eio '--admin-socket=/tmp/ceph-asok.102080/$cluster-$name.asok' --debug-osd=20 '--log-file=td/test-erasure-eio/$name.log' '--pid-file=td/test-erasure-eio/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:02:00.559 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/test-erasure-eio/2/whoami
2026-03-08T23:02:00.560 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']'
2026-03-08T23:02:00.561 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:02:00.562 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:02:00.565 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:02:00.589 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:02:00.590+0000 7fbbab813780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:02:00.589 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:02:00.591+0000 7fbbab813780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:02:00.591 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:02:00.593+0000 7fbbab813780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:02:00.811 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2
2026-03-08T23:02:00.811 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:02:00.812 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-08T23:02:00.812 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T23:02:00.812 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:02:00.812 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:02:00.812 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:02:00.812 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:02:00.812 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:02:00.812 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:02:01.043 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:02:01.672 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:02:01.674+0000 7fbbab813780 -1 Falling back to public interface
2026-03-08T23:02:02.044 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:02:02.044 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:02:02.044 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-08T23:02:02.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:02:02.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:02:02.045 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:02:02.273 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:02:02.545 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:02:02.547+0000 7fbbab813780 -1 osd.2 34 log_to_monitors true
2026-03-08T23:02:03.276 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-08T23:02:03.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:02:03.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:02:03.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:02:03.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:02:03.277 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:02:03.498 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:02:04.435 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-08T23:02:04.436+0000 7fbba208b640 -1 osd.2 34 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T23:02:04.505 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-08T23:02:04.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:02:04.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:02:04.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:02:04.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:02:04.505 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:02:04.768 INFO:tasks.workunit.client.0.vm04.stdout:osd.2 up in weight 1 up_from 43 up_thru 19 down_at 35 last_clean_interval [19,34) [v2:127.0.0.1:6818/621074452,v1:127.0.0.1:6819/621074452] [v2:127.0.0.1:6820/621074452,v1:127.0.0.1:6821/621074452] exists,up 7577b3a8-ca1c-4d06-a3f8-651a92dea82d
2026-03-08T23:02:04.768 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:02:04.768 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:02:04.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:02:04.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:178: rados_put_get_data: wait_for_clean
2026-03-08T23:02:04.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:02:04.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:02:04.769 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:02:04.769 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:02:04.769 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:02:04.769 INFO:tasks.workunit.client.0.vm04.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:02:04.769 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:02:04.769 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:02:04.769 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:02:04.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:02:04.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:02:04.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:02:04.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:02:04.851 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:02:04.852 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:02:05.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:02:05.094 INFO:tasks.workunit.client.0.vm04.stderr:1
2026-03-08T23:02:05.094 INFO:tasks.workunit.client.0.vm04.stderr:2
2026-03-08T23:02:05.094 INFO:tasks.workunit.client.0.vm04.stderr:3'
2026-03-08T23:02:05.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:02:05.094 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:05.094 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:02:05.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=25769803787
2026-03-08T23:02:05.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 25769803787
2026-03-08T23:02:05.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787'
2026-03-08T23:02:05.184 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:05.184 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:02:05.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=55834574858
2026-03-08T23:02:05.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 55834574858
2026-03-08T23:02:05.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858'
2026-03-08T23:02:05.267 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:05.267 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:02:05.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=184683593731
2026-03-08T23:02:05.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 184683593731
2026-03-08T23:02:05.351
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-184683593731' 2026-03-08T23:02:05.351 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:05.351 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:02:05.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=115964116999 2026-03-08T23:02:05.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 115964116999 2026-03-08T23:02:05.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-25769803787 1-55834574858 2-184683593731 3-115964116999' 2026-03-08T23:02:05.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:05.431 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-25769803787 2026-03-08T23:02:05.431 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:05.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:02:05.433 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-25769803787 2026-03-08T23:02:05.433 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:05.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=25769803787 2026-03-08T23:02:05.434 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.0 seq 25769803787 2026-03-08T23:02:05.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 25769803787' 2026-03-08T23:02:05.434 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:02:05.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803785 -lt 25769803787 2026-03-08T23:02:05.660 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:02:06.661 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:02:06.661 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:02:06.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803785 -lt 25769803787 2026-03-08T23:02:06.904 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:02:07.905 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 
']' 2026-03-08T23:02:07.906 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:02:08.169 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 25769803788 -lt 25769803787 2026-03-08T23:02:08.169 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:08.170 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-55834574858 2026-03-08T23:02:08.170 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:08.171 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:02:08.172 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-55834574858 2026-03-08T23:02:08.172 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:08.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=55834574858 2026-03-08T23:02:08.174 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 55834574858' 2026-03-08T23:02:08.174 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.1 seq 55834574858 2026-03-08T23:02:08.174 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:02:08.417 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 55834574858 -lt 55834574858 2026-03-08T23:02:08.458 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:08.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-184683593731 2026-03-08T23:02:08.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:08.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:02:08.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-184683593731 2026-03-08T23:02:08.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:08.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=184683593731 2026-03-08T23:02:08.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 184683593731' 2026-03-08T23:02:08.459 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.2 seq 184683593731 2026-03-08T23:02:08.459 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:02:08.663 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 184683593731 -lt 184683593731 2026-03-08T23:02:08.664 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:08.664 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-115964116999 2026-03-08T23:02:08.664 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:08.665 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:02:08.666 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-115964116999 2026-03-08T23:02:08.666 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:08.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=115964116999 2026-03-08T23:02:08.668 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 115964116999' 2026-03-08T23:02:08.668 INFO:tasks.workunit.client.0.vm04.stdout:waiting osd.3 seq 115964116999 2026-03-08T23:02:08.668 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:02:08.910 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 115964116999 -lt 
115964116999 2026-03-08T23:02:08.911 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:02:08.911 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:02:08.911 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:02:09.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:02:09.249 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:02:09.249 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:02:09.249 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:02:09.249 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:02:09.249 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:02:09.249 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:02:09.249 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq 
'.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:02:09.471 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:02:09.471 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:02:09.472 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:02:09.472 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:02:09.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:02:09.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:02:09.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:02:09.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:354: TEST_rados_get_with_subreadall_eio_shard_1: delete_erasure_coded_pool pool-jerasure 2026-03-08T23:02:09.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:84: delete_erasure_coded_pool: local poolname=pool-jerasure 2026-03-08T23:02:09.787 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:85: delete_erasure_coded_pool: ceph osd pool delete pool-jerasure pool-jerasure 
--yes-i-really-really-mean-it 2026-03-08T23:02:10.011 INFO:tasks.workunit.client.0.vm04.stderr:pool 'pool-jerasure' does not exist 2026-03-08T23:02:10.022 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:86: delete_erasure_coded_pool: ceph osd erasure-code-profile rm myprofile 2026-03-08T23:02:10.296 INFO:tasks.workunit.client.0.vm04.stderr:erasure-code-profile myprofile does not exist 2026-03-08T23:02:10.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/erasure-code/test-erasure-eio.sh:42: run: teardown td/test-erasure-eio 2026-03-08T23:02:10.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/test-erasure-eio 2026-03-08T23:02:10.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:02:10.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T23:02:10.306 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:02:10.306 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:02:10.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:02:10.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:02:10.306 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o 
xtrace 2026-03-08T23:02:10.431 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:02:10.431 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:02:10.432 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:02:10.432 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:02:10.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T23:02:10.433 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:02:10.434 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:02:10.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:02:10.434 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:02:10.435 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:02:10.435 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:02:10.436 
INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:02:10.437 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:02:10.437 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T23:02:10.456 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:02:10.456 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:02:10.456 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:02:10.456 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T23:02:10.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:02:10.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:02:10.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0 2026-03-08T23:02:10.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/test-erasure-eio 0 2026-03-08T23:02:10.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local 
dir=td/test-erasure-eio 2026-03-08T23:02:10.459 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0 2026-03-08T23:02:10.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/test-erasure-eio KILL 2026-03-08T23:02:10.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:02:10.460 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:02:10.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:02:10.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:02:10.460 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:02:10.461 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:02:10.461 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:02:10.462 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:02:10.462 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:02:10.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T23:02:10.463 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:02:10.464 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:02:10.464 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:02:10.464 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:02:10.465 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:02:10.465 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:02:10.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:02:10.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-08T23:02:10.466 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/test-erasure-eio 2026-03-08T23:02:10.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:02:10.467 
INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:02:10.467 INFO:tasks.workunit.client.0.vm04.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.102080 2026-03-08T23:02:10.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.102080 2026-03-08T23:02:10.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:02:10.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:02:10.468 INFO:tasks.workunit.client.0.vm04.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T23:02:10.469 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T23:02:10.469 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T23:02:10.540 INFO:tasks.workunit:Stopping ['erasure-code'] on client.0... 
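The `flush_pg_stats` trace above follows a simple pattern: `ceph tell osd.N flush_pg_stats` returns a target sequence number, and the helper then polls `ceph osd last-stat-seq N` once per second until the reported sequence is no longer below the target, giving up after 300 attempts. A minimal sketch of that wait loop (the function name `wait_for_seq` is an assumption for illustration; the real helper lives in `qa/standalone/ceph-helpers.sh`):

```shell
#!/bin/sh
# Sketch of the one-second poll loop from flush_pg_stats: keep reading the
# current sequence with $get_cur until it is no longer below $want, or fail
# once $timeout one-second attempts are used up. Names are illustrative,
# not copied from ceph-helpers.sh.
wait_for_seq() {
    get_cur=$1        # command printing the current last-stat-seq
    want=$2           # sequence returned by "ceph tell osd.N flush_pg_stats"
    timeout=${3:-300} # number of one-second retries before giving up
    while test "$($get_cur)" -lt "$want"; do
        sleep 1
        timeout=$((timeout - 1))
        test "$timeout" -eq 0 && return 1
    done
    return 0
}
```

In the log above, osd.0 reports 25769803785 twice before catching up to the target 25769803787, so the loop sleeps twice before breaking out and returning 0, exactly as traced at ceph-helpers.sh:2277-2279.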
2026-03-08T23:02:10.541 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-08T23:02:10.978 DEBUG:teuthology.parallel:result is None 2026-03-08T23:02:10.979 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-08T23:02:11.005 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-08T23:02:11.005 DEBUG:teuthology.orchestra.run.vm04:> rmdir -- /home/ubuntu/cephtest/mnt.0 2026-03-08T23:02:11.067 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0 2026-03-08T23:02:11.067 DEBUG:teuthology.run_tasks:Unwinding manager install 2026-03-08T23:02:11.069 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer... 2026-03-08T23:02:11.069 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-08T23:02:11.140 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, ceph-volume, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 
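Earlier in the run, `get_timeout_delays 90 .1` expanded to the array `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`: the delay doubles on each retry, is capped, and a final remainder pads the schedule so the delays sum to the 90-second timeout. A sketch reproducing that schedule — the 15-second cap and the function name are inferred from the logged output, not taken from the helper's source:

```shell
#!/bin/sh
# Sketch of an exponential-backoff schedule matching the delays array logged
# by wait_for_clean: double the step each iteration, cap it, and emit the
# remainder so the delays sum to the requested timeout. The cap default of
# 15 is an assumption inferred from the logged array.
backoff_delays() {
    awk -v total="$1" -v step="$2" -v cap="${3:-15}" 'BEGIN {
        sum = 0
        while (sum + step < total) {
            printf "%s%g", (sum > 0 ? " " : ""), step
            sum += step
            step *= 2
            if (step > cap) step = cap
        }
        # pad with whatever is left so the schedule totals the timeout
        if (total > sum) printf "%s%g", (sum > 0 ? " " : ""), total - sum
        print ""
    }'
}
```

Running `backoff_delays 90 0.1` reproduces the thirteen-entry schedule seen in the trace: the first eight entries sum to 25.5, four capped 15-second waits bring it to 85.5, and the trailing 4.5 closes the gap to 90.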
2026-03-08T23:02:11.141 DEBUG:teuthology.orchestra.run.vm04:>
2026-03-08T23:02:11.141 DEBUG:teuthology.orchestra.run.vm04:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do
2026-03-08T23:02:11.141 DEBUG:teuthology.orchestra.run.vm04:> sudo yum -y remove $d || true
2026-03-08T23:02:11.141 DEBUG:teuthology.orchestra.run.vm04:> done
2026-03-08T23:02:11.363 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 39 M
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout:Remove 2 Packages
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 39 M
2026-03-08T23:02:11.364 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-08T23:02:11.366 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-08T23:02:11.366 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-08T23:02:11.380 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-08T23:02:11.381 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-08T23:02:11.417 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-08T23:02:11.446 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T23:02:11.447 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:02:11.447 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-08T23:02:11.447 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target".
2026-03-08T23:02:11.447 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target".
2026-03-08T23:02:11.447 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:11.451 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T23:02:11.463 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T23:02:11.478 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-08T23:02:11.552 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2
2026-03-08T23:02:11.552 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T23:02:11.612 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-08T23:02:11.612 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:11.612 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-08T23:02:11.612 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 mailcap-2.1.49-5.el9.noarch
2026-03-08T23:02:11.612 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:11.612 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-08T23:02:11.834 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 210 M
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout:Remove 4 Packages
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 212 M
2026-03-08T23:02:11.835 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-08T23:02:11.838 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-08T23:02:11.839 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-08T23:02:11.862 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-08T23:02:11.863 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-08T23:02:11.930 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-08T23:02:11.937 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 1/4
2026-03-08T23:02:11.940 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4
2026-03-08T23:02:11.945 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4
2026-03-08T23:02:11.961 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-08T23:02:12.049 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-08T23:02:12.049 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 1/4
2026-03-08T23:02:12.049 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-08T23:02:12.049 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-08T23:02:12.116 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-08T23:02:12.116 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:12.116 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-08T23:02:12.117 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-08T23:02:12.117 INFO:teuthology.orchestra.run.vm04.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-08T23:02:12.117 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:12.117 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-08T23:02:12.340 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout: ceph x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 0
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 7.5 M
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 18 M
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout: lua x86_64 5.4.4-4.el9 @appstream 593 k
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout: lua-devel x86_64 5.4.4-4.el9 @crb 49 k
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout: luarocks noarch 3.9.2-5.el9 @epel 692 k
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout: unzip x86_64 6.0-59.el9 @baseos 389 k
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout: zip x86_64 3.0-35.el9 @baseos 724 k
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout:Remove 8 Packages
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 28 M
2026-03-08T23:02:12.341 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-08T23:02:12.344 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-08T23:02:12.344 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-08T23:02:12.369 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-08T23:02:12.369 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-08T23:02:12.415 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-08T23:02:12.421 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 1/8
2026-03-08T23:02:12.425 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : luarocks-3.9.2-5.el9.noarch 2/8
2026-03-08T23:02:12.428 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : lua-devel-5.4.4-4.el9.x86_64 3/8
2026-03-08T23:02:12.431 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : zip-3.0-35.el9.x86_64 4/8
2026-03-08T23:02:12.434 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : unzip-6.0-59.el9.x86_64 5/8
2026-03-08T23:02:12.437 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : lua-5.4.4-4.el9.x86_64 6/8
2026-03-08T23:02:12.460 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 7/8
2026-03-08T23:02:12.460 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:02:12.460 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-08T23:02:12.460 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-08T23:02:12.460 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-08T23:02:12.460 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:12.461 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 7/8
2026-03-08T23:02:12.468 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 7/8
2026-03-08T23:02:12.488 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/8
2026-03-08T23:02:12.488 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:02:12.488 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-08T23:02:12.488 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-08T23:02:12.488 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-08T23:02:12.488 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:12.490 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/8
2026-03-08T23:02:12.580 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/8
2026-03-08T23:02:12.580 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 1/8
2026-03-08T23:02:12.580 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 2/8
2026-03-08T23:02:12.580 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 3/8
2026-03-08T23:02:12.580 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lua-5.4.4-4.el9.x86_64 4/8
2026-03-08T23:02:12.580 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lua-devel-5.4.4-4.el9.x86_64 5/8
2026-03-08T23:02:12.580 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : luarocks-3.9.2-5.el9.noarch 6/8
2026-03-08T23:02:12.580 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : unzip-6.0-59.el9.x86_64 7/8
2026-03-08T23:02:12.636 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : zip-3.0-35.el9.x86_64 8/8
2026-03-08T23:02:12.636 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:12.637 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-08T23:02:12.637 INFO:teuthology.orchestra.run.vm04.stdout: ceph-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T23:02:12.637 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T23:02:12.637 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T23:02:12.637 INFO:teuthology.orchestra.run.vm04.stdout: lua-5.4.4-4.el9.x86_64
2026-03-08T23:02:12.637 INFO:teuthology.orchestra.run.vm04.stdout: lua-devel-5.4.4-4.el9.x86_64
2026-03-08T23:02:12.637 INFO:teuthology.orchestra.run.vm04.stdout: luarocks-3.9.2-5.el9.noarch
2026-03-08T23:02:12.637 INFO:teuthology.orchestra.run.vm04.stdout: unzip-6.0-59.el9.x86_64
2026-03-08T23:02:12.637 INFO:teuthology.orchestra.run.vm04.stdout: zip-3.0-35.el9.x86_64
2026-03-08T23:02:12.637 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:12.637 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-08T23:02:12.862 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout:===========================================================================================
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout:===========================================================================================
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 23 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages:
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 431 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 3.4 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 806 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 88 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 66 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 563 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 59 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-volume noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 1.4 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 13 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: abseil-cpp x86_64 20211102.0-4.el9 @epel 1.9 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 85 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 628 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 1.5 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 52 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 138 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: cryptsetup x86_64 2.8.1-3.el9 @baseos 770 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: grpc-data noarch 1.46.7-10.el9 @epel 13 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 425 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 1.6 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: pciutils x86_64 3.7.0-7.el9 @baseos 216 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: protobuf x86_64 3.14.0-17.el9 @appstream 3.5 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-compiler x86_64 3.14.0-17.el9 @crb 2.9 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-08T23:02:12.868 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 702 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio x86_64 1.46.7-10.el9 @epel 6.7 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-tools x86_64 1.46.7-10.el9 @epel 418 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-packaging noarch 20.9-5.el9 @appstream 248 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-protobuf noarch 3.14.0-17.el9 @appstream 1.4 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyparsing noarch 2.4.7-9.el9 @baseos 635 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-08T23:02:12.869 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout: qatlib x86_64 25.08.0-2.el9 @appstream 639 k
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout: qatlib-service x86_64 25.08.0-2.el9 @appstream 69 k
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout: qatzip-libs x86_64 1.3.1-1.el9 @appstream 148 k
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout:===========================================================================================
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout:Remove 103 Packages
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 613 M
2026-03-08T23:02:12.870 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-08T23:02:12.896 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-08T23:02:12.896 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-08T23:02:13.009 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-08T23:02:13.010 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-08T23:02:13.174 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-08T23:02:13.174 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 1/103
2026-03-08T23:02:13.186 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 1/103
2026-03-08T23:02:13.207 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103
2026-03-08T23:02:13.207 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:02:13.207 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-08T23:02:13.207 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-08T23:02:13.207 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-08T23:02:13.207 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:13.208 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103
2026-03-08T23:02:13.223 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103
2026-03-08T23:02:13.250 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 3/103
2026-03-08T23:02:13.250 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 4/103
2026-03-08T23:02:13.310 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 4/103
2026-03-08T23:02:13.320 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/103
2026-03-08T23:02:13.325 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/103
2026-03-08T23:02:13.325 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 7/103
2026-03-08T23:02:13.342 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 7/103
2026-03-08T23:02:13.349 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/103
2026-03-08T23:02:13.354 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/103
2026-03-08T23:02:13.363 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-grpcio-tools-1.46.7-10.el9.x86_64 10/103
2026-03-08T23:02:13.368 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-grpcio-1.46.7-10.el9.x86_64 11/103
2026-03-08T23:02:13.390 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103
2026-03-08T23:02:13.390 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:02:13.390 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-08T23:02:13.390 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-08T23:02:13.390 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-08T23:02:13.390 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:13.396 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103
2026-03-08T23:02:13.406 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103
2026-03-08T23:02:13.426 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 13/103
2026-03-08T23:02:13.426 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:02:13.426 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-volume@*.service" escaped as "ceph-volume@\x2a.service".
2026-03-08T23:02:13.426 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:13.434 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 13/103
2026-03-08T23:02:13.445 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 13/103
2026-03-08T23:02:13.448 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 14/103
2026-03-08T23:02:13.453 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 15/103
2026-03-08T23:02:13.457 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 16/103
2026-03-08T23:02:13.466 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 17/103
2026-03-08T23:02:13.478 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 18/103
2026-03-08T23:02:13.485 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 19/103
2026-03-08T23:02:13.495 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 20/103
2026-03-08T23:02:13.502 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 21/103
2026-03-08T23:02:13.532 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 22/103
2026-03-08T23:02:13.541 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 23/103
2026-03-08T23:02:13.544 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 24/103
2026-03-08T23:02:13.554 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 25/103
2026-03-08T23:02:13.566 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 26/103
2026-03-08T23:02:13.566 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 27/103
2026-03-08T23:02:13.574 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 27/103
2026-03-08T23:02:13.673 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 28/103
2026-03-08T23:02:13.689 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 29/103
2026-03-08T23:02:13.704 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 30/103
2026-03-08T23:02:13.704 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-08T23:02:13.704 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-08T23:02:13.705 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 30/103
2026-03-08T23:02:13.738 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 30/103
2026-03-08T23:02:13.755 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 31/103
2026-03-08T23:02:13.760 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 32/103
2026-03-08T23:02:13.763 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : protobuf-compiler-3.14.0-17.el9.x86_64 33/103
2026-03-08T23:02:13.765 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 34/103
2026-03-08T23:02:13.788 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 35/103
2026-03-08T23:02:13.788 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:02:13.788 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-08T23:02:13.788 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target". 2026-03-08T23:02:13.788 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target". 2026-03-08T23:02:13.788 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:13.790 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 35/103 2026-03-08T23:02:13.804 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 35/103 2026-03-08T23:02:13.809 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 36/103 2026-03-08T23:02:13.813 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 37/103 2026-03-08T23:02:13.817 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-packaging-20.9-5.el9.noarch 38/103 2026-03-08T23:02:13.820 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 39/103 2026-03-08T23:02:13.824 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 40/103 2026-03-08T23:02:13.827 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 41/103 2026-03-08T23:02:13.832 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 42/103 2026-03-08T23:02:13.837 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 43/103 2026-03-08T23:02:13.885 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 44/103 2026-03-08T23:02:13.897 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 45/103 2026-03-08T23:02:13.900 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 46/103 2026-03-08T23:02:13.905 
INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 47/103 2026-03-08T23:02:13.907 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 48/103 2026-03-08T23:02:13.911 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 49/103 2026-03-08T23:02:13.913 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 50/103 2026-03-08T23:02:13.935 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:19.2.3-678.ge911bd 51/103 2026-03-08T23:02:13.935 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T23:02:13.935 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-08T23:02:13.935 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:13.935 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 51/103 2026-03-08T23:02:13.945 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:19.2.3-678.ge911bd 51/103 2026-03-08T23:02:13.947 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 52/103 2026-03-08T23:02:13.949 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 53/103 2026-03-08T23:02:13.953 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ply-3.11-14.el9.noarch 54/103 2026-03-08T23:02:13.955 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 55/103 2026-03-08T23:02:13.957 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 56/103 2026-03-08T23:02:13.960 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 57/103 2026-03-08T23:02:13.963 
INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 58/103 2026-03-08T23:02:13.966 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 59/103 2026-03-08T23:02:13.969 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyparsing-2.4.7-9.el9.noarch 60/103 2026-03-08T23:02:13.977 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 61/103 2026-03-08T23:02:13.982 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 62/103 2026-03-08T23:02:13.985 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 63/103 2026-03-08T23:02:13.988 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 64/103 2026-03-08T23:02:13.990 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 65/103 2026-03-08T23:02:13.996 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 66/103 2026-03-08T23:02:14.000 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 67/103 2026-03-08T23:02:14.006 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 68/103 2026-03-08T23:02:14.011 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 69/103 2026-03-08T23:02:14.017 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 70/103 2026-03-08T23:02:14.021 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 71/103 2026-03-08T23:02:14.024 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 72/103 2026-03-08T23:02:14.030 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : grpc-data-1.46.7-10.el9.noarch 73/103 2026-03-08T23:02:14.035 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : 
python3-protobuf-3.14.0-17.el9.noarch 74/103 2026-03-08T23:02:14.039 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 75/103 2026-03-08T23:02:14.047 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 76/103 2026-03-08T23:02:14.053 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 77/103 2026-03-08T23:02:14.056 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 78/103 2026-03-08T23:02:14.059 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 79/103 2026-03-08T23:02:14.061 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 80/103 2026-03-08T23:02:14.066 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 81/103 2026-03-08T23:02:14.070 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 82/103 2026-03-08T23:02:14.091 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 83/103 2026-03-08T23:02:14.092 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph.target". 2026-03-08T23:02:14.092 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 
2026-03-08T23:02:14.092 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:14.099 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 83/103 2026-03-08T23:02:14.131 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 83/103 2026-03-08T23:02:14.131 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 84/103 2026-03-08T23:02:14.146 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 84/103 2026-03-08T23:02:14.151 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : qatzip-libs-1.3.1-1.el9.x86_64 85/103 2026-03-08T23:02:14.153 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 86/103 2026-03-08T23:02:14.155 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 87/103 2026-03-08T23:02:14.155 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 88/103 2026-03-08T23:02:20.442 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 88/103 2026-03-08T23:02:20.442 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /sys 2026-03-08T23:02:20.442 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /proc 2026-03-08T23:02:20.442 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /mnt 2026-03-08T23:02:20.442 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /var/tmp 2026-03-08T23:02:20.442 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /home 2026-03-08T23:02:20.442 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /root 2026-03-08T23:02:20.442 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /tmp 2026-03-08T23:02:20.443 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:20.453 
INFO:teuthology.orchestra.run.vm04.stdout: Erasing : qatlib-25.08.0-2.el9.x86_64 89/103 2026-03-08T23:02:20.472 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 90/103 2026-03-08T23:02:20.473 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : qatlib-service-25.08.0-2.el9.x86_64 90/103 2026-03-08T23:02:20.483 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 90/103 2026-03-08T23:02:20.486 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 91/103 2026-03-08T23:02:20.490 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 92/103 2026-03-08T23:02:20.493 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : pciutils-3.7.0-7.el9.x86_64 93/103 2026-03-08T23:02:20.496 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 94/103 2026-03-08T23:02:20.496 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 95/103 2026-03-08T23:02:20.511 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 95/103 2026-03-08T23:02:20.513 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 96/103 2026-03-08T23:02:20.516 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 97/103 2026-03-08T23:02:20.520 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 98/103 2026-03-08T23:02:20.523 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : protobuf-3.14.0-17.el9.x86_64 99/103 2026-03-08T23:02:20.528 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 100/103 2026-03-08T23:02:20.536 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : cryptsetup-2.8.1-3.el9.x86_64 101/103 2026-03-08T23:02:20.541 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : 
abseil-cpp-20211102.0-4.el9.x86_64 102/103 2026-03-08T23:02:20.541 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 103/103 2026-03-08T23:02:20.646 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 103/103 2026-03-08T23:02:20.646 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : abseil-cpp-20211102.0-4.el9.x86_64 1/103 2026-03-08T23:02:20.646 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103 2026-03-08T23:02:20.646 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 3/103 2026-03-08T23:02:20.646 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 4/103 2026-03-08T23:02:20.646 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 5/103 2026-03-08T23:02:20.646 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 6/103 2026-03-08T23:02:20.646 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 7/103 2026-03-08T23:02:20.646 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 8/103 2026-03-08T23:02:20.646 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 9/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 10/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 11/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : 
ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 13/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 14/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 15/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cryptsetup-2.8.1-3.el9.x86_64 16/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 17/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 18/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 19/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 20/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : grpc-data-1.46.7-10.el9.noarch 21/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 22/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 23/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 24/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 25/103 2026-03-08T23:02:20.647 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 26/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 27/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 28/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : 
libstoragemgmt-1.10.1-1.el9.x86_64 29/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 30/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 31/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 32/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : pciutils-3.7.0-7.el9.x86_64 33/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : protobuf-3.14.0-17.el9.x86_64 34/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : protobuf-compiler-3.14.0-17.el9.x86_64 35/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 36/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 37/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 38/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 39/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 40/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 41/103 2026-03-08T23:02:20.648 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 42/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 43/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 44/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 45/103 
2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 46/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 47/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 48/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 49/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 50/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-grpcio-1.46.7-10.el9.x86_64 51/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-grpcio-tools-1.46.7-10.el9.x86_64 52/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 53/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 54/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 55/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 56/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 57/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 58/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 59/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 60/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 61/103 
2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 62/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 63/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 64/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 65/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 66/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 67/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 68/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 69/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 70/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 71/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 72/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-packaging-20.9-5.el9.noarch 73/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 74/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ply-3.11-14.el9.noarch 75/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 76/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 77/103 2026-03-08T23:02:20.649 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-protobuf-3.14.0-17.el9.noarch 78/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 79/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 80/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 81/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 82/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyparsing-2.4.7-9.el9.noarch 83/103 2026-03-08T23:02:20.649 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 84/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 85/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 86/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 87/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 88/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 89/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 90/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 91/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 92/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 93/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: 
Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 94/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 95/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 96/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 97/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 98/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 99/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatlib-25.08.0-2.el9.x86_64 100/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatlib-service-25.08.0-2.el9.x86_64 101/103 2026-03-08T23:02:20.650 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatzip-libs-1.3.1-1.el9.x86_64 102/103 2026-03-08T23:02:20.727 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 103/103 2026-03-08T23:02:20.727 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:20.727 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-08T23:02:20.727 INFO:teuthology.orchestra.run.vm04.stdout: abseil-cpp-20211102.0-4.el9.x86_64 2026-03-08T23:02:20.727 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:20.727 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:20.727 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T23:02:20.727 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:20.727 INFO:teuthology.orchestra.run.vm04.stdout: 
ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:20.727 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T23:02:20.727 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: cryptsetup-2.8.1-3.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: grpc-data-1.46.7-10.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: 
libconfig-1.7.2-9.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: pciutils-3.7.0-7.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-3.14.0-17.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-compiler-3.14.0-17.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi-2023.05.07-4.el9.noarch 
2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-1.46.7-10.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-tools-1.46.7-10.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpointer-2.0-4.el9.noarch 
2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-08T23:02:20.728 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-packaging-20.9-5.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply-3.11-14.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-protobuf-3.14.0-17.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 
2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyparsing-2.4.7-9.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: qatlib-25.08.0-2.el9.x86_64 2026-03-08T23:02:20.729 
INFO:teuthology.orchestra.run.vm04.stdout: qatlib-service-25.08.0-2.el9.x86_64 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: qatzip-libs-1.3.1-1.el9.x86_64 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:20.729 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout: cephadm noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 775 k 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout:Remove 1 Package 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 775 k 2026-03-08T23:02:20.968 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-08T23:02:20.970 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 
2026-03-08T23:02:20.970 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-08T23:02:20.972 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 2026-03-08T23:02:20.972 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-08T23:02:20.989 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-08T23:02:20.989 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 1/1 2026-03-08T23:02:21.105 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 1/1 2026-03-08T23:02:21.153 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 1/1 2026-03-08T23:02:21.153 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:21.153 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-08T23:02:21.153 INFO:teuthology.orchestra.run.vm04.stdout: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T23:02:21.153 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:21.153 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:21.354 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-immutable-object-cache 2026-03-08T23:02:21.354 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-08T23:02:21.358 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:21.358 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-08T23:02:21.358 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:21.546 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr 2026-03-08T23:02:21.547 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-08T23:02:21.550 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:21.551 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 
2026-03-08T23:02:21.551 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:21.735 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-dashboard 2026-03-08T23:02:21.736 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-08T23:02:21.739 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:21.739 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-08T23:02:21.740 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:21.936 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-diskprediction-local 2026-03-08T23:02:21.936 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-08T23:02:21.940 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:21.940 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-08T23:02:21.940 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:22.134 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-rook 2026-03-08T23:02:22.134 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-08T23:02:22.138 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:22.138 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-08T23:02:22.138 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:22.332 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-cephadm 2026-03-08T23:02:22.332 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-08T23:02:22.336 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:22.337 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-08T23:02:22.337 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 
2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 3.6 M 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout:Remove 1 Package 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 3.6 M 2026-03-08T23:02:22.550 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-08T23:02:22.552 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-08T23:02:22.552 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-08T23:02:22.562 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 
2026-03-08T23:02:22.563 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-08T23:02:22.589 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-08T23:02:22.604 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 1/1 2026-03-08T23:02:22.672 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 1/1 2026-03-08T23:02:22.720 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 1/1 2026-03-08T23:02:22.720 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:22.720 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-08T23:02:22.720 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:22.720 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:22.720 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:22.918 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-volume 2026-03-08T23:02:22.918 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-08T23:02:22.921 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:22.921 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-08T23:02:22.921 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:23.128 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 
2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repo Size 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 456 k 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages: 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 153 k 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout:Remove 2 Packages 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 610 k 2026-03-08T23:02:23.129 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-08T23:02:23.131 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-08T23:02:23.131 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-08T23:02:23.141 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 
2026-03-08T23:02:23.141 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-08T23:02:23.170 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-08T23:02:23.174 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2 2026-03-08T23:02:23.192 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2/2 2026-03-08T23:02:23.257 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2/2 2026-03-08T23:02:23.257 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2 2026-03-08T23:02:23.312 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2/2 2026-03-08T23:02:23.312 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:23.312 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-08T23:02:23.312 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:23.312 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:23.312 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:23.312 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:23.537 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 
2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repo Size 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 3.0 M 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages: 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 514 k 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies: 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 187 k 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout:Remove 3 Packages 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 3.7 M 2026-03-08T23:02:23.538 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-08T23:02:23.540 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-08T23:02:23.540 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-08T23:02:23.558 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 
2026-03-08T23:02:23.558 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-08T23:02:23.673 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-08T23:02:23.676 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 1/3 2026-03-08T23:02:23.678 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86 2/3 2026-03-08T23:02:23.678 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 3/3 2026-03-08T23:02:23.744 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 3/3 2026-03-08T23:02:23.744 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 1/3 2026-03-08T23:02:23.744 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86 2/3 2026-03-08T23:02:23.831 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 3/3 2026-03-08T23:02:23.831 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:23.831 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-08T23:02:23.831 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:23.831 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:23.831 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:23.831 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:23.831 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:24.083 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: libcephfs-devel 2026-03-08T23:02:24.083 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 
2026-03-08T23:02:24.086 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:24.087 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-08T23:02:24.087 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:24.281 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: librados2 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 12 M 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages: 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 1.1 M 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 1.1 M 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 265 k 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 227 k 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 490 k 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies: 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k 
2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: libnbd x86_64 1.20.3-4.el9 @appstream 453 k 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: librbd1 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 13 M 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M 2026-03-08T23:02:24.283 INFO:teuthology.orchestra.run.vm04.stdout: librgw2 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 19 M 2026-03-08T23:02:24.284 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M 2026-03-08T23:02:24.284 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M 2026-03-08T23:02:24.284 INFO:teuthology.orchestra.run.vm04.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k 2026-03-08T23:02:24.284 INFO:teuthology.orchestra.run.vm04.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M 2026-03-08T23:02:24.284 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:24.284 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-08T23:02:24.284 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-08T23:02:24.284 INFO:teuthology.orchestra.run.vm04.stdout:Remove 20 Packages 2026-03-08T23:02:24.284 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:24.284 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 79 M 2026-03-08T23:02:24.284 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 
2026-03-08T23:02:24.287 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-08T23:02:24.287 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-08T23:02:24.310 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 2026-03-08T23:02:24.310 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-08T23:02:24.352 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-08T23:02:24.355 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 1/20 2026-03-08T23:02:24.357 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 2/20 2026-03-08T23:02:24.360 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 3/20 2026-03-08T23:02:24.361 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/20 2026-03-08T23:02:24.375 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/20 2026-03-08T23:02:24.377 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/20 2026-03-08T23:02:24.380 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 6/20 2026-03-08T23:02:24.382 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 7/20 2026-03-08T23:02:24.384 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/20 2026-03-08T23:02:24.386 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 9/20 2026-03-08T23:02:24.386 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 10/20 2026-03-08T23:02:24.403 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 10/20 2026-03-08T23:02:24.404 
INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 11/20 2026-03-08T23:02:24.404 INFO:teuthology.orchestra.run.vm04.stdout:warning: file /etc/ceph: remove failed: No such file or directory 2026-03-08T23:02:24.404 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:24.420 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 11/20 2026-03-08T23:02:24.423 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 12/20 2026-03-08T23:02:24.427 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : re2-1:20211101-20.el9.x86_64 13/20 2026-03-08T23:02:24.431 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 14/20 2026-03-08T23:02:24.435 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 15/20 2026-03-08T23:02:24.440 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libnbd-1.20.3-4.el9.x86_64 16/20 2026-03-08T23:02:24.442 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 17/20 2026-03-08T23:02:24.444 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 18/20 2026-03-08T23:02:24.447 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 19/20 2026-03-08T23:02:24.463 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 20/20 2026-03-08T23:02:24.527 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 20/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 2/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 3/20 2026-03-08T23:02:24.528 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libnbd-1.20.3-4.el9.x86_64 4/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 7/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 8/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 10/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 11/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 12/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 13/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 14/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 15/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 16/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 17/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 18/20 2026-03-08T23:02:24.528 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : re2-1:20211101-20.el9.x86_64 19/20 2026-03-08T23:02:24.589 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 20/20 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: libnbd-1.20.3-4.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: 
rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: re2-1:20211101-20.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-08T23:02:24.589 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:24.832 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: librbd1 2026-03-08T23:02:24.832 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-08T23:02:24.834 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:24.835 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-08T23:02:24.835 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:25.033 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rados 2026-03-08T23:02:25.033 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-08T23:02:25.036 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:25.036 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-08T23:02:25.036 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-08T23:02:25.217 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rgw 2026-03-08T23:02:25.217 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-08T23:02:25.220 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-08T23:02:25.220 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-08T23:02:25.220 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 
2026-03-08T23:02:25.393 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-cephfs
2026-03-08T23:02:25.393 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-08T23:02:25.395 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-08T23:02:25.395 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-08T23:02:25.395 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-08T23:02:25.576 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rbd
2026-03-08T23:02:25.576 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-08T23:02:25.578 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-08T23:02:25.578 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-08T23:02:25.578 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-08T23:02:25.752 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-fuse
2026-03-08T23:02:25.752 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-08T23:02:25.754 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-08T23:02:25.755 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-08T23:02:25.755 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-08T23:02:25.938 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-mirror
2026-03-08T23:02:25.938 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-08T23:02:25.940 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-08T23:02:25.941 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-08T23:02:25.941 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-08T23:02:26.126 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-nbd
2026-03-08T23:02:26.127 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-08T23:02:26.130 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-08T23:02:26.130 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-08T23:02:26.130 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-08T23:02:26.157 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean all
2026-03-08T23:02:26.300 INFO:teuthology.orchestra.run.vm04.stdout:56 files removed
2026-03-08T23:02:26.331 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-08T23:02:26.359 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean expire-cache
2026-03-08T23:02:26.525 INFO:teuthology.orchestra.run.vm04.stdout:Cache was expired
2026-03-08T23:02:26.525 INFO:teuthology.orchestra.run.vm04.stdout:0 files removed
2026-03-08T23:02:26.561 DEBUG:teuthology.parallel:result is None
2026-03-08T23:02:26.561 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm04.local
2026-03-08T23:02:26.562 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-08T23:02:26.588 DEBUG:teuthology.orchestra.run.vm04:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-08T23:02:26.655 DEBUG:teuthology.parallel:result is None
2026-03-08T23:02:26.655 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-08T23:02:26.658 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-08T23:02:26.658 DEBUG:teuthology.orchestra.run.vm04:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T23:02:26.712 INFO:teuthology.orchestra.run.vm04.stderr:bash: line 1: ntpq: command not found
2026-03-08T23:02:31.722 INFO:teuthology.orchestra.run.vm04.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-08T23:02:31.722 INFO:teuthology.orchestra.run.vm04.stdout:===============================================================================
2026-03-08T23:02:31.722 INFO:teuthology.orchestra.run.vm04.stdout:^+ ntp2.kernfusion.at 2 6 377 34 -372us[ -372us] +/- 27ms
2026-03-08T23:02:31.722 INFO:teuthology.orchestra.run.vm04.stdout:^+ ntp2.wup-de.hosts.301-mo> 2 7 377 164 +1184us[+1680us] +/- 19ms
2026-03-08T23:02:31.722 INFO:teuthology.orchestra.run.vm04.stdout:^- mail.sassmann.nrw 2 6 377 38 +1258us[+1258us] +/- 44ms
2026-03-08T23:02:31.722 INFO:teuthology.orchestra.run.vm04.stdout:^* ntp3.lwlcom.net 1 7 377 99 -2121us[-1921us] +/- 16ms
2026-03-08T23:02:31.723 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-08T23:02:31.725 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-08T23:02:31.725 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-08T23:02:31.734 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-08T23:02:31.752 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-08T23:02:31.784 INFO:teuthology.task.internal:Duration was 1576.843435 seconds
2026-03-08T23:02:31.784 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-08T23:02:31.820 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-08T23:02:31.821 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-08T23:02:31.866 INFO:teuthology.orchestra.run.vm04.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-08T23:02:32.323 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-08T23:02:32.323 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm04.local
2026-03-08T23:02:32.323 DEBUG:teuthology.orchestra.run.vm04:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-08T23:02:32.354 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-08T23:02:32.354 DEBUG:teuthology.orchestra.run.vm04:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-08T23:02:32.801 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-08T23:02:32.801 DEBUG:teuthology.orchestra.run.vm04:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-08T23:02:32.833 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T23:02:32.833 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T23:02:32.833 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: gzip 0.0% -5 -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-08T23:02:32.833 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-08T23:02:32.833 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-08T23:02:32.986 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 98.5% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-08T23:02:32.988 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-08T23:02:33.090 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-08T23:02:33.090 DEBUG:teuthology.orchestra.run.vm04:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-08T23:02:33.119 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-08T23:02:33.166 DEBUG:teuthology.orchestra.run.vm04:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:02:33.194 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern = core
2026-03-08T23:02:33.211 DEBUG:teuthology.orchestra.run.vm04:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:02:33.265 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T23:02:33.265 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-08T23:02:33.292 INFO:teuthology.task.internal:Transferring archived files...
2026-03-08T23:02:33.292 DEBUG:teuthology.misc:Transferring archived files from vm04:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/276/remote/vm04
2026-03-08T23:02:33.292 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-08T23:02:33.365 INFO:teuthology.task.internal:Removing archive directory...
2026-03-08T23:02:33.365 DEBUG:teuthology.orchestra.run.vm04:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-08T23:02:33.419 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-08T23:02:33.453 INFO:teuthology.task.internal:Not uploading archives.
2026-03-08T23:02:33.453 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-08T23:02:33.477 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-08T23:02:33.477 DEBUG:teuthology.orchestra.run.vm04:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-08T23:02:33.494 INFO:teuthology.orchestra.run.vm04.stdout: 8532146 0 drwxr-xr-x 2 ubuntu ubuntu 6 Mar 8 23:02 /home/ubuntu/cephtest
2026-03-08T23:02:33.495 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-08T23:02:33.523 INFO:teuthology.run:Summary data:
description: rados:standalone/{supported-random-distro$/{centos_latest} workloads/erasure-code}
duration: 1576.843435049057
flavor: default
owner: kyr
success: true

2026-03-08T23:02:33.523 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T23:02:33.548 INFO:teuthology.run:pass