2026-03-08T22:35:35.850 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-08T22:35:35.853 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T22:35:35.869 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/275
branch: squid
description: rados:standalone/{supported-random-distro$/{centos_latest} workloads/crush}
email: null
first_in_suite: false
flavor: default
job_id: '275'
last_in_suite: false
machine_type: vps
name: kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps
no_nested_subset: false
openstack:
- volumes:
    count: 3
    size: 10
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    conf:
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        debug ms: 1
        debug osd: 20
        osd mclock iops capacity threshold hdd: 49000
    flavor: default
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  selinux:
    allowlist:
    - scontext=system_u:system_r:getty_t:s0
  workunit:
    branch: tt-squid
    sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 5909
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
suite: rados:standalone
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
targets:
  vm10.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIfNprY4kaRE/oLoEL7/Nr69kJowVUTAA7YBrpRcht+FbJsGY22j6yfKeeP0lR/UQuJ7h2xvwfuEIgL8FCSnTqw=
tasks:
- install: null
- workunit:
    basedir: qa/standalone
    clients:
      all:
      - crush
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-08_21:49:43
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
2026-03-08T22:35:35.869 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa; will attempt to use it
2026-03-08T22:35:35.869 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks
2026-03-08T22:35:35.870 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-08T22:35:35.870 INFO:teuthology.task.internal:Checking packages...
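The job configuration dumped above is plain YAML, so it can be inspected programmatically when triaging a run. A minimal sketch, assuming the dump has been saved to a file named job.yaml (the file name and the PyYAML dependency are illustrative, not part of this run):

```python
import yaml  # PyYAML

# Load the job config as dumped by teuthology.run above.
with open("job.yaml") as f:
    job = yaml.safe_load(f)

# The override hierarchy mirrors the task names:
# overrides -> ceph -> conf -> <daemon section>.
print(job["overrides"]["ceph"]["conf"]["osd"]["debug osd"])  # -> 20
print(job["targets"])                        # remotes keyed by hostname
print([list(t)[0] for t in job["tasks"]])    # -> ['install', 'workunit']
```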
2026-03-08T22:35:35.870 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-08T22:35:35.870 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-08T22:35:35.870 INFO:teuthology.packaging:ref: None
2026-03-08T22:35:35.870 INFO:teuthology.packaging:tag: None
2026-03-08T22:35:35.870 INFO:teuthology.packaging:branch: squid
2026-03-08T22:35:35.870 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:35:35.870 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid
2026-03-08T22:35:36.653 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb
2026-03-08T22:35:36.654 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-08T22:35:36.654 INFO:teuthology.task.internal:no buildpackages task found
2026-03-08T22:35:36.654 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-08T22:35:36.655 INFO:teuthology.task.internal:Saving configuration
2026-03-08T22:35:36.658 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-08T22:35:36.658 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-08T22:35:36.665 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm10.local', 'description': '/archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/275', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-08 22:33:35.639846', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:0a', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIfNprY4kaRE/oLoEL7/Nr69kJowVUTAA7YBrpRcht+FbJsGY22j6yfKeeP0lR/UQuJ7h2xvwfuEIgL8FCSnTqw='}
2026-03-08T22:35:36.665 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-08T22:35:36.666 INFO:teuthology.task.internal:roles: ubuntu@vm10.local - ['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-08T22:35:36.666 INFO:teuthology.run_tasks:Running task console_log...
2026-03-08T22:35:36.671 DEBUG:teuthology.task.console_log:vm10 does not support IPMI; excluding
2026-03-08T22:35:36.671 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f1dd6dd3d00>, signals=[15])
2026-03-08T22:35:36.671 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-08T22:35:36.672 INFO:teuthology.task.internal:Opening connections...
2026-03-08T22:35:36.672 DEBUG:teuthology.task.internal:connecting to ubuntu@vm10.local
2026-03-08T22:35:36.673 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm10.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:35:36.730 INFO:teuthology.run_tasks:Running task internal.push_inventory...
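The shaman query logged above is a plain HTTP GET returning JSON. A minimal sketch of reproducing the same lookup by hand; only the URL and query parameters are taken from the log, and the response fields printed below are assumptions, since the exact schema is not shown here:

```python
import requests

# The same search teuthology.packaging logs above
# (distros=centos%2F9%2Fx86_64 is the URL-encoded form of centos/9/x86_64).
resp = requests.get(
    "https://shaman.ceph.com/api/search",
    params={
        "status": "ready",
        "project": "ceph",
        "flavor": "default",
        "distros": "centos/9/x86_64",
        "ref": "squid",
    },
    timeout=30,
)
resp.raise_for_status()
for build in resp.json():
    # Field names are illustrative; inspect build.keys() for the real schema.
    print(build.get("sha1"), build.get("status"), build.get("url"))
```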
2026-03-08T22:35:36.732 DEBUG:teuthology.orchestra.run.vm10:> uname -m
2026-03-08T22:35:36.888 INFO:teuthology.orchestra.run.vm10.stdout:x86_64
2026-03-08T22:35:36.888 DEBUG:teuthology.orchestra.run.vm10:> cat /etc/os-release
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:NAME="CentOS Stream"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:VERSION="9"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:ID="centos"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:ID_LIKE="rhel fedora"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:VERSION_ID="9"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:PLATFORM_ID="platform:el9"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:ANSI_COLOR="0;31"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:LOGO="fedora-logo-icon"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:HOME_URL="https://centos.org/"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-08T22:35:36.942 INFO:teuthology.orchestra.run.vm10.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-08T22:35:36.943 INFO:teuthology.lock.ops:Updating vm10.local on lock server
2026-03-08T22:35:36.947 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-08T22:35:36.949 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-08T22:35:36.950 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-08T22:35:36.950 DEBUG:teuthology.orchestra.run.vm10:> test '!' -e /home/ubuntu/cephtest
2026-03-08T22:35:36.996 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-08T22:35:36.997 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-08T22:35:36.997 DEBUG:teuthology.orchestra.run.vm10:> test -z $(ls -A /var/lib/ceph)
2026-03-08T22:35:37.051 INFO:teuthology.orchestra.run.vm10.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-08T22:35:37.051 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-08T22:35:37.059 DEBUG:teuthology.orchestra.run.vm10:> test -e /ceph-qa-ready
2026-03-08T22:35:37.108 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:35:37.292 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-08T22:35:37.293 INFO:teuthology.task.internal:Creating test directory...
2026-03-08T22:35:37.293 DEBUG:teuthology.orchestra.run.vm10:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-08T22:35:37.310 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-08T22:35:37.312 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-08T22:35:37.313 INFO:teuthology.task.internal:Creating archive directory...
2026-03-08T22:35:37.313 DEBUG:teuthology.orchestra.run.vm10:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-08T22:35:37.367 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-08T22:35:37.369 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-08T22:35:37.369 DEBUG:teuthology.orchestra.run.vm10:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-08T22:35:37.420 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:35:37.421 DEBUG:teuthology.orchestra.run.vm10:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-08T22:35:37.486 INFO:teuthology.orchestra.run.vm10.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:35:37.494 INFO:teuthology.orchestra.run.vm10.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:35:37.496 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-08T22:35:37.497 INFO:teuthology.task.internal:Configuring sudo...
2026-03-08T22:35:37.497 DEBUG:teuthology.orchestra.run.vm10:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-08T22:35:37.560 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-08T22:35:37.562 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-08T22:35:37.562 DEBUG:teuthology.orchestra.run.vm10:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-08T22:35:37.615 DEBUG:teuthology.orchestra.run.vm10:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T22:35:37.677 DEBUG:teuthology.orchestra.run.vm10:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T22:35:37.734 DEBUG:teuthology.orchestra.run.vm10:> set -ex
2026-03-08T22:35:37.734 DEBUG:teuthology.orchestra.run.vm10:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-08T22:35:37.791 DEBUG:teuthology.orchestra.run.vm10:> sudo service rsyslog restart
2026-03-08T22:35:37.860 INFO:teuthology.orchestra.run.vm10.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-08T22:35:38.097 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-08T22:35:38.098 INFO:teuthology.task.internal:Starting timer...
2026-03-08T22:35:38.099 INFO:teuthology.run_tasks:Running task pcp...
2026-03-08T22:35:38.101 INFO:teuthology.run_tasks:Running task selinux...
2026-03-08T22:35:38.103 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:getty_t:s0']}
2026-03-08T22:35:38.103 INFO:teuthology.task.selinux:Excluding vm10: VMs are not yet supported
2026-03-08T22:35:38.103 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-08T22:35:38.103 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-08T22:35:38.103 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-08T22:35:38.103 INFO:teuthology.run_tasks:Running task ansible.cephlab...
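The coredump task above points kernel.core_pattern at the archive directory with a %t.%p.core template; per core(5), %t expands to the UNIX timestamp of the dump and %p to the PID of the dumping process. A minimal sketch for mapping collected core files back to wall-clock time; the directory path comes from the log, the script itself is illustrative:

```python
import datetime
import pathlib

# Directory the coredump task configured above via sysctl.
coredir = pathlib.Path("/home/ubuntu/cephtest/archive/coredump")

for core in sorted(coredir.glob("*.core")):
    # File names follow the %t.%p.core template: <epoch>.<pid>.core
    epoch, pid, _ = core.name.split(".", 2)
    when = datetime.datetime.fromtimestamp(int(epoch))
    print(f"{core.name}: pid {pid} dumped at {when:%Y-%m-%d %H:%M:%S}")
```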
2026-03-08T22:35:38.105 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-08T22:35:38.105 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-08T22:35:38.106 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-08T22:35:38.853 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-08T22:35:38.859 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-08T22:35:38.860 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventorypbx3v5ro --limit vm10.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-08T22:37:19.986 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm10.local')]
2026-03-08T22:37:19.986 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm10.local'
2026-03-08T22:37:19.987 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm10.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:37:20.053 DEBUG:teuthology.orchestra.run.vm10:> true
2026-03-08T22:37:20.129 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm10.local'
2026-03-08T22:37:20.129 INFO:teuthology.run_tasks:Running task clock...
2026-03-08T22:37:20.132 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-08T22:37:20.132 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-08T22:37:20.132 DEBUG:teuthology.orchestra.run.vm10:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T22:37:20.209 INFO:teuthology.orchestra.run.vm10.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-08T22:37:20.224 INFO:teuthology.orchestra.run.vm10.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-08T22:37:20.261 INFO:teuthology.orchestra.run.vm10.stderr:sudo: ntpd: command not found
2026-03-08T22:37:20.273 INFO:teuthology.orchestra.run.vm10.stdout:506 Cannot talk to daemon
2026-03-08T22:37:20.288 INFO:teuthology.orchestra.run.vm10.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-08T22:37:20.305 INFO:teuthology.orchestra.run.vm10.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-08T22:37:20.363 INFO:teuthology.orchestra.run.vm10.stderr:bash: line 1: ntpq: command not found
2026-03-08T22:37:20.367 INFO:teuthology.orchestra.run.vm10.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-08T22:37:20.367 INFO:teuthology.orchestra.run.vm10.stdout:===============================================================================
2026-03-08T22:37:20.367 INFO:teuthology.orchestra.run.vm10.stdout:^? stratum2-3.NTP.TechFak.U> 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-08T22:37:20.367 INFO:teuthology.orchestra.run.vm10.stdout:^? kronos.mailus.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-08T22:37:20.367 INFO:teuthology.orchestra.run.vm10.stdout:^? mail.light-speed.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-08T22:37:20.367 INFO:teuthology.orchestra.run.vm10.stdout:^? pve2.h4x-gamers.top 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-08T22:37:20.367 INFO:teuthology.run_tasks:Running task install...
2026-03-08T22:37:20.369 DEBUG:teuthology.task.install:project ceph
2026-03-08T22:37:20.369 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:37:20.369 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:37:20.369 INFO:teuthology.task.install:Using flavor: default
2026-03-08T22:37:20.372 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-08T22:37:20.372 INFO:teuthology.task.install:extra packages: []
2026-03-08T22:37:20.372 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': [], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': None, 'wait_for_package': False}
2026-03-08T22:37:20.372 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:37:21.069 INFO:teuthology.task.install.rpm:Pulling from https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/
2026-03-08T22:37:21.069 INFO:teuthology.task.install.rpm:Package version is 19.2.3-678.ge911bdeb
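The chacra URL the install task pulls from follows a predictable layout: host/r/project/ref/sha1/distro/release/flavors/flavor/. A minimal sketch assembling it from the job parameters; the pattern is inferred from the two URLs visible in this log, so treat it as an illustration rather than a stable API contract:

```python
# Pieces taken from the job config and the install task output above.
host = "https://3.chacra.ceph.com"
project, ref = "ceph", "squid"
sha1 = "e911bdebe5c8faa3800735d1568fcdca65db60df"
distro, release, flavor = "centos", "9", "default"

repo_url = f"{host}/r/{project}/{ref}/{sha1}/{distro}/{release}/flavors/{flavor}/"
print(repo_url)
# -> https://3.chacra.ceph.com/r/ceph/squid/e911bdeb.../centos/9/flavors/default/
```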
2026-03-08T22:37:21.595 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-noarch]
name=ceph noarch packages
baseurl=https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-source]
name=ceph source packages
baseurl=https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
2026-03-08T22:37:21.595 DEBUG:teuthology.orchestra.run.vm10:> set -ex
2026-03-08T22:37:21.595 DEBUG:teuthology.orchestra.run.vm10:> sudo dd of=/etc/yum.repos.d/ceph.repo
2026-03-08T22:37:21.631 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, ceph-volume, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64
2026-03-08T22:37:21.631 DEBUG:teuthology.orchestra.run.vm10:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;sha1/e911bdebe5c8faa3800735d1568fcdca65db60df/;g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-08T22:37:21.707 DEBUG:teuthology.orchestra.run.vm10:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-08T22:37:21.794 DEBUG:teuthology.orchestra.run.vm10:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-08T22:37:21.818 INFO:teuthology.orchestra.run.vm10.stdout:check_obsoletes = 1
2026-03-08T22:37:21.823 DEBUG:teuthology.orchestra.run.vm10:> sudo yum clean all
2026-03-08T22:37:22.034 INFO:teuthology.orchestra.run.vm10.stdout:41 files removed
2026-03-08T22:37:22.058 DEBUG:teuthology.orchestra.run.vm10:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath
2026-03-08T22:37:23.563 INFO:teuthology.orchestra.run.vm10.stdout:ceph packages for x86_64 65 kB/s | 84 kB 00:01
2026-03-08T22:37:24.603 INFO:teuthology.orchestra.run.vm10.stdout:ceph noarch packages 12 kB/s | 12 kB 00:01
2026-03-08T22:37:25.555 INFO:teuthology.orchestra.run.vm10.stdout:ceph source packages 2.0 kB/s | 1.9 kB 00:00
2026-03-08T22:37:26.294 INFO:teuthology.orchestra.run.vm10.stdout:CentOS Stream 9 - BaseOS 12 MB/s | 8.9 MB 00:00
2026-03-08T22:37:28.174 INFO:teuthology.orchestra.run.vm10.stdout:CentOS Stream 9 - AppStream 26 MB/s | 27 MB 00:01
2026-03-08T22:37:36.199 INFO:teuthology.orchestra.run.vm10.stdout:CentOS Stream 9 - CRB 1.7 MB/s | 8.0 MB 00:04
2026-03-08T22:37:37.449 INFO:teuthology.orchestra.run.vm10.stdout:CentOS Stream 9 - Extras packages 102 kB/s | 20 kB 00:00
2026-03-08T22:37:38.548 INFO:teuthology.orchestra.run.vm10.stdout:Extra Packages for Enterprise Linux 20 MB/s | 20 MB 00:01
2026-03-08T22:37:43.954 INFO:teuthology.orchestra.run.vm10.stdout:lab-extras 64 kB/s | 50 kB 00:00
2026-03-08T22:37:45.671 INFO:teuthology.orchestra.run.vm10.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-08T22:37:45.672 INFO:teuthology.orchestra.run.vm10.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-08T22:37:45.676 INFO:teuthology.orchestra.run.vm10.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-08T22:37:45.677 INFO:teuthology.orchestra.run.vm10.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-08T22:37:45.710 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout:======================================================================================
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: Package Arch Version Repository Size
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout:======================================================================================
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout:Installing:
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 6.5 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-base x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 5.5 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 1.2 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-immutable-object-cache x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 145 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 1.1 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-cephadm noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 150 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-dashboard noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 3.8 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-diskprediction-local noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 7.4 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-rook noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 49 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-radosgw x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 11 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-test x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 50 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-volume noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 299 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: cephadm noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 769 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: libcephfs-devel x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 34 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: libcephfs2 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 1.0 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: librados-devel x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 127 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: python3-cephfs x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 165 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: python3-rados x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 323 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: python3-rbd x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 303 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: python3-rgw x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 100 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: rbd-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 85 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: rbd-mirror x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 3.1 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: rbd-nbd x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 171 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout:Upgrading:
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: librados2 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 3.4 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: librbd1 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 3.2 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout:Installing dependencies:
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: abseil-cpp x86_64 20211102.0-4.el9 epel 551 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 22 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-grafana-dashboards noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 31 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mds x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 2.4 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-modules-core noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 253 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mon x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 4.7 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-osd x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 17 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-prometheus-alerts noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 17 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ceph-selinux x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 25 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: cryptsetup x86_64 2.8.1-3.el9 baseos 351 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: grpc-data noarch 1.46.7-10.el9 epel 19 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-08T22:37:45.715 INFO:teuthology.orchestra.run.vm10.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: libcephsqlite x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 163 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: libnbd x86_64 1.20.3-4.el9 appstream 164 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: libradosstriper1 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 503 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: librgw2 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 5.4 M
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: lua x86_64 5.4.4-4.el9 appstream 188 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: lua-devel x86_64 5.4.4-4.el9 crb 22 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: luarocks noarch 3.9.2-5.el9 epel 151 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: pciutils x86_64 3.7.0-7.el9 baseos 93 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: protobuf x86_64 3.14.0-17.el9 appstream 1.0 M
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: protobuf-compiler x86_64 3.14.0-17.el9 crb 862 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-ceph-argparse x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 45 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 142 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-grpcio x86_64 1.46.7-10.el9 epel 2.0 M
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-grpcio-tools x86_64 1.46.7-10.el9 epel 144 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-packaging noarch 20.9-5.el9 appstream 77 k
2026-03-08T22:37:45.716 INFO:teuthology.orchestra.run.vm10.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-protobuf noarch 3.14.0-17.el9 appstream 267 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyparsing noarch 2.4.7-9.el9 baseos 150 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: qatlib x86_64 25.08.0-2.el9 appstream 240 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: qatzip-libs x86_64 1.3.1-1.el9 appstream 66 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: unzip x86_64 6.0-59.el9 baseos 182 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: zip x86_64 3.0-35.el9 baseos 266 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout:Installing weak dependencies:
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout: qatlib-service x86_64 25.08.0-2.el9 appstream 37 k
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout:Transaction Summary
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout:======================================================================================
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout:Install 135 Packages
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout:Upgrade 2 Packages
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout:Total download size: 210 M
2026-03-08T22:37:45.717 INFO:teuthology.orchestra.run.vm10.stdout:Downloading Packages:
2026-03-08T22:37:47.468 INFO:teuthology.orchestra.run.vm10.stdout:(1/137): ceph-19.2.3-678.ge911bdeb.el9.x86_64.r 14 kB/s | 6.5 kB 00:00
2026-03-08T22:37:48.310 INFO:teuthology.orchestra.run.vm10.stdout:(2/137): ceph-fuse-19.2.3-678.ge911bdeb.el9.x86 1.4 MB/s | 1.2 MB 00:00
2026-03-08T22:37:48.437 INFO:teuthology.orchestra.run.vm10.stdout:(3/137): ceph-immutable-object-cache-19.2.3-678 1.1 MB/s | 145 kB 00:00
2026-03-08T22:37:48.472 INFO:teuthology.orchestra.run.vm10.stdout:(4/137): ceph-base-19.2.3-678.ge911bdeb.el9.x86 3.7 MB/s | 5.5 MB 00:01
2026-03-08T22:37:48.600 INFO:teuthology.orchestra.run.vm10.stdout:(5/137): ceph-mgr-19.2.3-678.ge911bdeb.el9.x86_ 8.4 MB/s | 1.1 MB 00:00
2026-03-08T22:37:48.700 INFO:teuthology.orchestra.run.vm10.stdout:(6/137): ceph-mds-19.2.3-678.ge911bdeb.el9.x86_ 9.2 MB/s | 2.4 MB 00:00
2026-03-08T22:37:48.990 INFO:teuthology.orchestra.run.vm10.stdout:(7/137): ceph-mon-19.2.3-678.ge911bdeb.el9.x86_ 12 MB/s | 4.7 MB 00:00
2026-03-08T22:37:49.661 INFO:teuthology.orchestra.run.vm10.stdout:(8/137): ceph-common-19.2.3-678.ge911bdeb.el9.x 8.2 MB/s | 22 MB 00:02
2026-03-08T22:37:49.842 INFO:teuthology.orchestra.run.vm10.stdout:(9/137): ceph-radosgw-19.2.3-678.ge911bdeb.el9. 13 MB/s | 11 MB 00:00
2026-03-08T22:37:49.890 INFO:teuthology.orchestra.run.vm10.stdout:(10/137): ceph-selinux-19.2.3-678.ge911bdeb.el9 109 kB/s | 25 kB 00:00
2026-03-08T22:37:50.021 INFO:teuthology.orchestra.run.vm10.stdout:(11/137): ceph-osd-19.2.3-678.ge911bdeb.el9.x86 13 MB/s | 17 MB 00:01
2026-03-08T22:37:50.022 INFO:teuthology.orchestra.run.vm10.stdout:(12/137): libcephfs-devel-19.2.3-678.ge911bdeb. 253 kB/s | 34 kB 00:00
2026-03-08T22:37:50.148 INFO:teuthology.orchestra.run.vm10.stdout:(13/137): libcephfs2-19.2.3-678.ge911bdeb.el9.x 7.7 MB/s | 1.0 MB 00:00
2026-03-08T22:37:50.149 INFO:teuthology.orchestra.run.vm10.stdout:(14/137): libcephsqlite-19.2.3-678.ge911bdeb.el 1.3 MB/s | 163 kB 00:00
2026-03-08T22:37:50.278 INFO:teuthology.orchestra.run.vm10.stdout:(15/137): librados-devel-19.2.3-678.ge911bdeb.e 979 kB/s | 127 kB 00:00
2026-03-08T22:37:50.281 INFO:teuthology.orchestra.run.vm10.stdout:(16/137): libradosstriper1-19.2.3-678.ge911bdeb 3.7 MB/s | 503 kB 00:00
2026-03-08T22:37:50.475 INFO:teuthology.orchestra.run.vm10.stdout:(17/137): python3-ceph-argparse-19.2.3-678.ge91 232 kB/s | 45 kB 00:00
2026-03-08T22:37:50.595 INFO:teuthology.orchestra.run.vm10.stdout:(18/137): python3-ceph-common-19.2.3-678.ge911b 1.2 MB/s | 142 kB 00:00
2026-03-08T22:37:50.660 INFO:teuthology.orchestra.run.vm10.stdout:(19/137): librgw2-19.2.3-678.ge911bdeb.el9.x86_ 14 MB/s | 5.4 MB 00:00
2026-03-08T22:37:50.715 INFO:teuthology.orchestra.run.vm10.stdout:(20/137): python3-cephfs-19.2.3-678.ge911bdeb.e 1.3 MB/s | 165 kB 00:00
2026-03-08T22:37:50.781 INFO:teuthology.orchestra.run.vm10.stdout:(21/137): python3-rados-19.2.3-678.ge911bdeb.el 2.6 MB/s | 323 kB 00:00
2026-03-08T22:37:50.837 INFO:teuthology.orchestra.run.vm10.stdout:(22/137): python3-rbd-19.2.3-678.ge911bdeb.el9. 2.4 MB/s | 303 kB 00:00
2026-03-08T22:37:50.899 INFO:teuthology.orchestra.run.vm10.stdout:(23/137): python3-rgw-19.2.3-678.ge911bdeb.el9. 846 kB/s | 100 kB 00:00
2026-03-08T22:37:50.956 INFO:teuthology.orchestra.run.vm10.stdout:(24/137): rbd-fuse-19.2.3-678.ge911bdeb.el9.x86 713 kB/s | 85 kB 00:00
2026-03-08T22:37:51.076 INFO:teuthology.orchestra.run.vm10.stdout:(25/137): rbd-nbd-19.2.3-678.ge911bdeb.el9.x86_ 1.4 MB/s | 171 kB 00:00
2026-03-08T22:37:51.153 INFO:teuthology.orchestra.run.vm10.stdout:(26/137): rbd-mirror-19.2.3-678.ge911bdeb.el9.x 12 MB/s | 3.1 MB 00:00
2026-03-08T22:37:51.196 INFO:teuthology.orchestra.run.vm10.stdout:(27/137): ceph-grafana-dashboards-19.2.3-678.ge 260 kB/s | 31 kB 00:00
2026-03-08T22:37:51.328 INFO:teuthology.orchestra.run.vm10.stdout:(28/137): ceph-mgr-cephadm-19.2.3-678.ge911bdeb 861 kB/s | 150 kB 00:00
2026-03-08T22:37:51.516 INFO:teuthology.orchestra.run.vm10.stdout:(29/137): ceph-mgr-dashboard-19.2.3-678.ge911bd 12 MB/s | 3.8 MB 00:00
2026-03-08T22:37:51.671 INFO:teuthology.orchestra.run.vm10.stdout:(30/137): ceph-mgr-modules-core-19.2.3-678.ge91 1.6 MB/s | 253 kB 00:00
2026-03-08T22:37:51.824 INFO:teuthology.orchestra.run.vm10.stdout:(31/137): ceph-mgr-rook-19.2.3-678.ge911bdeb.el 322 kB/s | 49 kB 00:00
2026-03-08T22:37:51.855 INFO:teuthology.orchestra.run.vm10.stdout:(32/137): ceph-mgr-diskprediction-local-19.2.3- 14 MB/s | 7.4 MB 00:00
2026-03-08T22:37:51.944 INFO:teuthology.orchestra.run.vm10.stdout:(33/137): ceph-prometheus-alerts-19.2.3-678.ge9 141 kB/s | 17 kB 00:00
2026-03-08T22:37:51.976 INFO:teuthology.orchestra.run.vm10.stdout:(34/137): ceph-volume-19.2.3-678.ge911bdeb.el9. 2.4 MB/s | 299 kB 00:00
2026-03-08T22:37:52.071 INFO:teuthology.orchestra.run.vm10.stdout:(35/137): cephadm-19.2.3-678.ge911bdeb.el9.noar 5.9 MB/s | 769 kB 00:00
2026-03-08T22:37:52.782 INFO:teuthology.orchestra.run.vm10.stdout:(36/137): ledmon-libs-1.1.0-3.el9.x86_64.rpm 57 kB/s | 40 kB 00:00
2026-03-08T22:37:53.124 INFO:teuthology.orchestra.run.vm10.stdout:(37/137): ceph-test-19.2.3-678.ge911bdeb.el9.x8 15 MB/s | 50 MB 00:03
2026-03-08T22:37:53.332 INFO:teuthology.orchestra.run.vm10.stdout:(38/137): cryptsetup-2.8.1-3.el9.x86_64.rpm 259 kB/s | 351 kB 00:01
2026-03-08T22:37:53.709 INFO:teuthology.orchestra.run.vm10.stdout:(39/137): libconfig-1.7.2-9.el9.x86_64.rpm 78 kB/s | 72 kB 00:00
2026-03-08T22:37:53.868 INFO:teuthology.orchestra.run.vm10.stdout:(40/137): mailcap-2.1.49-5.el9.noarch.rpm 209 kB/s | 33 kB 00:00
2026-03-08T22:37:54.112 INFO:teuthology.orchestra.run.vm10.stdout:(41/137): pciutils-3.7.0-7.el9.x86_64.rpm 381 kB/s | 93 kB 00:00
2026-03-08T22:37:54.324 INFO:teuthology.orchestra.run.vm10.stdout:(42/137): libquadmath-11.5.0-14.el9.x86_64.rpm 186 kB/s | 184 kB 00:00
2026-03-08T22:37:54.597 INFO:teuthology.orchestra.run.vm10.stdout:(43/137): python3-cffi-1.14.5-5.el9.x86_64.rpm 522 kB/s | 253 kB 00:00
2026-03-08T22:37:54.716 INFO:teuthology.orchestra.run.vm10.stdout:(44/137): libgfortran-11.5.0-14.el9.x86_64.rpm 499 kB/s | 794 kB 00:01
2026-03-08T22:37:55.182 INFO:teuthology.orchestra.run.vm10.stdout:(45/137): python3-ply-3.11-14.el9.noarch.rpm 182 kB/s | 106 kB 00:00
2026-03-08T22:37:55.865 INFO:teuthology.orchestra.run.vm10.stdout:(46/137): python3-pycparser-2.20-6.el9.noarch.r 118 kB/s | 135 kB 00:01
2026-03-08T22:37:56.089 INFO:teuthology.orchestra.run.vm10.stdout:(47/137): python3-pyparsing-2.4.7-9.el9.noarch. 166 kB/s | 150 kB 00:00
2026-03-08T22:37:56.116 INFO:teuthology.orchestra.run.vm10.stdout:(48/137): python3-requests-2.25.1-10.el9.noarch 504 kB/s | 126 kB 00:00
2026-03-08T22:37:56.544 INFO:teuthology.orchestra.run.vm10.stdout:(49/137): python3-cryptography-36.0.1-5.el9.x86 575 kB/s | 1.2 MB 00:02
2026-03-08T22:37:57.924 INFO:teuthology.orchestra.run.vm10.stdout:(50/137): zip-3.0-35.el9.x86_64.rpm 192 kB/s | 266 kB 00:01
2026-03-08T22:37:58.123 INFO:teuthology.orchestra.run.vm10.stdout:(51/137): python3-urllib3-1.26.5-7.el9.noarch.r 107 kB/s | 218 kB 00:02
2026-03-08T22:37:58.342 INFO:teuthology.orchestra.run.vm10.stdout:(52/137): flexiblas-3.0.4-9.el9.x86_64.rpm 136 kB/s | 30 kB 00:00
2026-03-08T22:37:58.403 INFO:teuthology.orchestra.run.vm10.stdout:(53/137): boost-program-options-1.75.0-13.el9.x 218 kB/s | 104 kB 00:00
2026-03-08T22:37:58.467 INFO:teuthology.orchestra.run.vm10.stdout:(54/137): unzip-6.0-59.el9.x86_64.rpm 77 kB/s | 182 kB 00:02
2026-03-08T22:37:58.475 INFO:teuthology.orchestra.run.vm10.stdout:(55/137): flexiblas-openblas-openmp-3.0.4-9.el9 205 kB/s | 15 kB 00:00
2026-03-08T22:37:58.620 INFO:teuthology.orchestra.run.vm10.stdout:(56/137): libpmemobj-1.12.1-1.el9.x86_64.rpm 1.1 MB/s | 160 kB 00:00
2026-03-08T22:37:58.693 INFO:teuthology.orchestra.run.vm10.stdout:(57/137): librabbitmq-0.11.0-7.el9.x86_64.rpm 621 kB/s | 45 kB 00:00
2026-03-08T22:37:58.828 INFO:teuthology.orchestra.run.vm10.stdout:(58/137): flexiblas-netlib-3.0.4-9.el9.x86_64.r 6.1 MB/s | 3.0 MB 00:00
2026-03-08T22:37:58.829 INFO:teuthology.orchestra.run.vm10.stdout:(59/137): libnbd-1.20.3-4.el9.x86_64.rpm 453 kB/s | 164 kB 00:00
2026-03-08T22:37:58.862 INFO:teuthology.orchestra.run.vm10.stdout:(60/137): librdkafka-1.6.1-102.el9.x86_64.rpm 3.8 MB/s | 662 kB 00:00
2026-03-08T22:37:58.898 INFO:teuthology.orchestra.run.vm10.stdout:(61/137): libstoragemgmt-1.10.1-1.el9.x86_64.rp 3.4 MB/s | 246 kB 00:00
2026-03-08T22:37:58.940 INFO:teuthology.orchestra.run.vm10.stdout:(62/137): lttng-ust-2.12.0-6.el9.x86_64.rpm 3.7 MB/s | 292 kB 00:00
2026-03-08T22:37:58.968 INFO:teuthology.orchestra.run.vm10.stdout:(63/137): lua-5.4.4-4.el9.x86_64.rpm 2.7 MB/s | 188 kB 00:00
2026-03-08T22:37:58.971 INFO:teuthology.orchestra.run.vm10.stdout:(64/137): libxslt-1.1.34-12.el9.x86_64.rpm 1.6 MB/s | 233 kB 00:00
2026-03-08T22:37:59.010 INFO:teuthology.orchestra.run.vm10.stdout:(65/137): openblas-0.3.29-1.el9.x86_64.rpm 607 kB/s | 42 kB 00:00
2026-03-08T22:37:59.144 INFO:teuthology.orchestra.run.vm10.stdout:(66/137): openblas-openmp-0.3.29-1.el9.x86_64.r 30 MB/s | 5.3 MB 00:00
2026-03-08T22:37:59.147 INFO:teuthology.orchestra.run.vm10.stdout:(67/137): protobuf-3.14.0-17.el9.x86_64.rpm 5.7 MB/s | 1.0 MB 00:00
2026-03-08T22:37:59.215 INFO:teuthology.orchestra.run.vm10.stdout:(68/137): python3-devel-3.9.25-3.el9.x86_64.rpm 3.4 MB/s | 244 kB 00:00
2026-03-08T22:37:59.221 INFO:teuthology.orchestra.run.vm10.stdout:(69/137): python3-jinja2-2.11.3-8.el9.noarch.rp 3.3 MB/s | 249 kB 00:00
2026-03-08T22:37:59.283 INFO:teuthology.orchestra.run.vm10.stdout:(70/137): python3-jmespath-1.0.1-1.el9.noarch.r 694 kB/s | 48 kB 00:00
2026-03-08T22:37:59.305 INFO:teuthology.orchestra.run.vm10.stdout:(71/137): python3-babel-2.9.1-2.el9.noarch.rpm 20 MB/s | 6.0 MB 00:00
2026-03-08T22:37:59.306 INFO:teuthology.orchestra.run.vm10.stdout:(72/137): python3-libstoragemgmt-1.10.1-1.el9.x 2.0 MB/s | 177 kB 00:00
2026-03-08T22:37:59.353 INFO:teuthology.orchestra.run.vm10.stdout:(73/137): python3-mako-1.1.4-6.el9.noarch.rpm 2.4 MB/s | 172 kB 00:00
2026-03-08T22:37:59.374 INFO:teuthology.orchestra.run.vm10.stdout:(74/137): python3-markupsafe-1.1.1-12.el9.x86_6 504 kB/s | 35 kB 00:00
2026-03-08T22:37:59.436 INFO:teuthology.orchestra.run.vm10.stdout:(75/137): python3-numpy-f2py-1.23.5-2.el9.x86_6 5.3 MB/s | 442 kB 00:00
2026-03-08T22:37:59.444 INFO:teuthology.orchestra.run.vm10.stdout:(76/137): python3-packaging-20.9-5.el9.noarch.r 1.1 MB/s | 77 kB 00:00
2026-03-08T22:37:59.507 INFO:teuthology.orchestra.run.vm10.stdout:(77/137): python3-protobuf-3.14.0-17.el9.noarch 3.7 MB/s | 267 kB 00:00
2026-03-08T22:37:59.515 INFO:teuthology.orchestra.run.vm10.stdout:(78/137): python3-pyasn1-0.4.8-7.el9.noarch.rpm 2.2 MB/s | 157 kB 00:00
2026-03-08T22:37:59.543 INFO:teuthology.orchestra.run.vm10.stdout:(79/137): python3-numpy-1.23.5-2.el9.x86_64.rpm 26 MB/s | 6.1 MB 00:00
2026-03-08T22:37:59.578 INFO:teuthology.orchestra.run.vm10.stdout:(80/137): python3-pyasn1-modules-0.4.8-7.el9.no 3.8 MB/s | 277 kB 00:00
2026-03-08T22:37:59.584 INFO:teuthology.orchestra.run.vm10.stdout:(81/137): python3-requests-oauthlib-1.3.0-12.el 782 kB/s | 54 kB 00:00
2026-03-08T22:37:59.645 INFO:teuthology.orchestra.run.vm10.stdout:(82/137): python3-toml-0.10.2-6.el9.noarch.rpm 617 kB/s | 42 kB 00:00
2026-03-08T22:37:59.655 INFO:teuthology.orchestra.run.vm10.stdout:(83/137): qatlib-25.08.0-2.el9.x86_64.rpm 3.3 MB/s | 240 kB 00:00
2026-03-08T22:37:59.714 INFO:teuthology.orchestra.run.vm10.stdout:(84/137): qatlib-service-25.08.0-2.el9.x86_64.r 541 kB/s | 37 kB 00:00
2026-03-08T22:37:59.724 INFO:teuthology.orchestra.run.vm10.stdout:(85/137): qatzip-libs-1.3.1-1.el9.x86_64.rpm 964 kB/s | 66 kB 00:00
2026-03-08T22:37:59.829 INFO:teuthology.orchestra.run.vm10.stdout:(86/137): socat-1.7.4.1-8.el9.x86_64.rpm 2.6 MB/s | 303 kB 00:00
2026-03-08T22:37:59.875 INFO:teuthology.orchestra.run.vm10.stdout:(87/137): xmlstarlet-1.6.1-20.el9.x86_64.rpm 423 kB/s | 64 kB 00:00
2026-03-08T22:38:00.156 INFO:teuthology.orchestra.run.vm10.stdout:(88/137): python3-scipy-1.9.3-2.el9.x86_64.rpm 31 MB/s | 19 MB 00:00
2026-03-08T22:38:00.188 INFO:teuthology.orchestra.run.vm10.stdout:(89/137): abseil-cpp-20211102.0-4.el9.x86_64.rp 17 MB/s | 551 kB 00:00
2026-03-08T22:38:00.197 INFO:teuthology.orchestra.run.vm10.stdout:(90/137): gperftools-libs-2.9.1-3.el9.x86_64.rp 36 MB/s | 308 kB 00:00
2026-03-08T22:38:00.199 INFO:teuthology.orchestra.run.vm10.stdout:(91/137): grpc-data-1.46.7-10.el9.noarch.rpm 8.0 MB/s | 19 kB 00:00
2026-03-08T22:38:00.260 INFO:teuthology.orchestra.run.vm10.stdout:(92/137): libarrow-9.0.0-15.el9.x86_64.rpm 73 MB/s | 4.4 MB 00:00
2026-03-08T22:38:00.263 INFO:teuthology.orchestra.run.vm10.stdout:(93/137): libarrow-doc-9.0.0-15.el9.noarch.rpm 9.5 MB/s | 25 kB 00:00
2026-03-08T22:38:00.265 INFO:teuthology.orchestra.run.vm10.stdout:(94/137): liboath-2.6.12-1.el9.x86_64.rpm 18 MB/s | 49 kB 00:00
2026-03-08T22:38:00.269 INFO:teuthology.orchestra.run.vm10.stdout:(95/137): libunwind-1.6.2-1.el9.x86_64.rpm 22 MB/s | 67 kB 00:00
2026-03-08T22:38:00.273 INFO:teuthology.orchestra.run.vm10.stdout:(96/137): luarocks-3.9.2-5.el9.noarch.rpm 38 MB/s | 151 kB 00:00
2026-03-08T22:38:00.277 INFO:teuthology.orchestra.run.vm10.stdout:(97/137): lua-devel-5.4.4-4.el9.x86_64.rpm 50 kB/s | 22 kB 00:00
2026-03-08T22:38:00.288 INFO:teuthology.orchestra.run.vm10.stdout:(98/137): parquet-libs-9.0.0-15.el9.x86_64.rpm 56 MB/s | 838 kB 00:00
2026-03-08T22:38:00.290 INFO:teuthology.orchestra.run.vm10.stdout:(99/137): python3-autocommand-2.2.2-8.el9.noarc 12 MB/s | 29 kB 00:00
2026-03-08T22:38:00.294 INFO:teuthology.orchestra.run.vm10.stdout:(100/137): python3-backports-tarfile-1.2.0-1.el 17 MB/s | 60 kB 00:00
2026-03-08T22:38:00.296 INFO:teuthology.orchestra.run.vm10.stdout:(101/137): python3-bcrypt-3.2.2-1.el9.x86_64.rp 17 MB/s | 43 kB 00:00
2026-03-08T22:38:00.299 INFO:teuthology.orchestra.run.vm10.stdout:(102/137): python3-cachetools-4.2.4-1.el9.noarc 12 MB/s | 32 kB 00:00
2026-03-08T22:38:00.301 INFO:teuthology.orchestra.run.vm10.stdout:(103/137): python3-certifi-2023.05.07-4.el9.noa 6.6 MB/s | 14 kB 00:00
2026-03-08T22:38:00.305 INFO:teuthology.orchestra.run.vm10.stdout:(104/137): python3-asyncssh-2.13.2-5.el9.noarch 19 MB/s | 548 kB 00:00
2026-03-08T22:38:00.306 INFO:teuthology.orchestra.run.vm10.stdout:(105/137): python3-cheroot-10.0.1-4.el9.noarch. 40 MB/s | 173 kB 00:00
2026-03-08T22:38:00.311 INFO:teuthology.orchestra.run.vm10.stdout:(106/137): python3-google-auth-2.45.0-1.el9.noa 50 MB/s | 254 kB 00:00
2026-03-08T22:38:00.321 INFO:teuthology.orchestra.run.vm10.stdout:(107/137): python3-cherrypy-18.6.1-2.el9.noarch 22 MB/s | 358 kB 00:00
2026-03-08T22:38:00.342 INFO:teuthology.orchestra.run.vm10.stdout:(108/137): python3-grpcio-tools-1.46.7-10.el9.x 6.7 MB/s | 144 kB 00:00
2026-03-08T22:38:00.346 INFO:teuthology.orchestra.run.vm10.stdout:(109/137): python3-jaraco-8.2.1-3.el9.noarch.rp 2.7 MB/s | 11 kB 00:00
2026-03-08T22:38:00.351 INFO:teuthology.orchestra.run.vm10.stdout:(110/137): python3-grpcio-1.46.7-10.el9.x86_64. 51 MB/s | 2.0 MB 00:00
51 MB/s | 2.0 MB 00:00 2026-03-08T22:38:00.352 INFO:teuthology.orchestra.run.vm10.stdout:(111/137): python3-jaraco-classes-3.2.1-5.el9.n 3.0 MB/s | 18 kB 00:00 2026-03-08T22:38:00.354 INFO:teuthology.orchestra.run.vm10.stdout:(112/137): python3-jaraco-collections-3.0.0-8.e 9.8 MB/s | 23 kB 00:00 2026-03-08T22:38:00.354 INFO:teuthology.orchestra.run.vm10.stdout:(113/137): python3-jaraco-context-6.0.1-3.el9.n 7.9 MB/s | 20 kB 00:00 2026-03-08T22:38:00.356 INFO:teuthology.orchestra.run.vm10.stdout:(114/137): python3-jaraco-functools-3.5.0-2.el9 8.3 MB/s | 19 kB 00:00 2026-03-08T22:38:00.357 INFO:teuthology.orchestra.run.vm10.stdout:(115/137): python3-jaraco-text-4.0.0-2.el9.noar 11 MB/s | 26 kB 00:00 2026-03-08T22:38:00.362 INFO:teuthology.orchestra.run.vm10.stdout:(116/137): python3-logutils-0.3.5-21.el9.noarch 8.9 MB/s | 46 kB 00:00 2026-03-08T22:38:00.371 INFO:teuthology.orchestra.run.vm10.stdout:(117/137): python3-kubernetes-26.1.0-3.el9.noar 71 MB/s | 1.0 MB 00:00 2026-03-08T22:38:00.372 INFO:teuthology.orchestra.run.vm10.stdout:(118/137): python3-more-itertools-8.12.0-2.el9. 8.3 MB/s | 79 kB 00:00 2026-03-08T22:38:00.374 INFO:teuthology.orchestra.run.vm10.stdout:(119/137): python3-natsort-7.1.1-5.el9.noarch.r 22 MB/s | 58 kB 00:00 2026-03-08T22:38:00.376 INFO:teuthology.orchestra.run.vm10.stdout:(120/137): python3-portend-3.1.0-2.el9.noarch.r 7.6 MB/s | 16 kB 00:00 2026-03-08T22:38:00.379 INFO:teuthology.orchestra.run.vm10.stdout:(121/137): python3-pyOpenSSL-21.0.0-1.el9.noarc 28 MB/s | 90 kB 00:00 2026-03-08T22:38:00.381 INFO:teuthology.orchestra.run.vm10.stdout:(122/137): python3-pecan-1.4.2-3.el9.noarch.rpm 30 MB/s | 272 kB 00:00 2026-03-08T22:38:00.382 INFO:teuthology.orchestra.run.vm10.stdout:(123/137): python3-repoze-lru-0.7-16.el9.noarch 13 MB/s | 31 kB 00:00 2026-03-08T22:38:00.386 INFO:teuthology.orchestra.run.vm10.stdout:(124/137): python3-rsa-4.9-2.el9.noarch.rpm 16 MB/s | 59 kB 00:00 2026-03-08T22:38:00.389 INFO:teuthology.orchestra.run.vm10.stdout:(125/137): python3-tempora-5.0.0-2.el9.noarch.r 14 MB/s | 36 kB 00:00 2026-03-08T22:38:00.392 INFO:teuthology.orchestra.run.vm10.stdout:(126/137): python3-routes-2.5.1-5.el9.noarch.rp 16 MB/s | 188 kB 00:00 2026-03-08T22:38:00.393 INFO:teuthology.orchestra.run.vm10.stdout:(127/137): python3-typing-extensions-4.15.0-1.e 19 MB/s | 86 kB 00:00 2026-03-08T22:38:00.397 INFO:teuthology.orchestra.run.vm10.stdout:(128/137): python3-websocket-client-1.2.3-2.el9 27 MB/s | 90 kB 00:00 2026-03-08T22:38:00.403 INFO:teuthology.orchestra.run.vm10.stdout:(129/137): python3-webob-1.8.8-2.el9.noarch.rpm 22 MB/s | 230 kB 00:00 2026-03-08T22:38:00.405 INFO:teuthology.orchestra.run.vm10.stdout:(130/137): python3-werkzeug-2.0.3-3.el9.1.noarc 55 MB/s | 427 kB 00:00 2026-03-08T22:38:00.406 INFO:teuthology.orchestra.run.vm10.stdout:(131/137): python3-xmltodict-0.12.0-15.el9.noar 6.8 MB/s | 22 kB 00:00 2026-03-08T22:38:00.407 INFO:teuthology.orchestra.run.vm10.stdout:(132/137): python3-zc-lockfile-2.0-10.el9.noarc 9.1 MB/s | 20 kB 00:00 2026-03-08T22:38:00.416 INFO:teuthology.orchestra.run.vm10.stdout:(133/137): re2-20211101-20.el9.x86_64.rpm 20 MB/s | 191 kB 00:00 2026-03-08T22:38:00.429 INFO:teuthology.orchestra.run.vm10.stdout:(134/137): thrift-0.15.0-4.el9.x86_64.rpm 75 MB/s | 1.6 MB 00:00 2026-03-08T22:38:01.392 INFO:teuthology.orchestra.run.vm10.stdout:(135/137): librbd1-19.2.3-678.ge911bdeb.el9.x86 3.3 MB/s | 3.2 MB 00:00 2026-03-08T22:38:01.567 INFO:teuthology.orchestra.run.vm10.stdout:(136/137): librados2-19.2.3-678.ge911bdeb.el9.x 3.0 MB/s | 3.4 
MB 00:01 2026-03-08T22:43:06.881 INFO:teuthology.orchestra.run.vm10.stdout:[MIRROR] protobuf-compiler-3.14.0-17.el9.x86_64.rpm: Curl error (28): Timeout was reached for http://ftp.nsc.ru/pub/centos-9/9-stream/CRB/x86_64/os/Packages/protobuf-compiler-3.14.0-17.el9.x86_64.rpm [Operation too slow. Less than 1000 bytes/sec transferred the last 300 seconds] 2026-03-08T22:48:13.886 INFO:teuthology.orchestra.run.vm10.stdout:[MIRROR] protobuf-compiler-3.14.0-17.el9.x86_64.rpm: Curl error (28): Timeout was reached for http://ftp.nsc.ru/pub/centos-9/9-stream/CRB/x86_64/os/Packages/protobuf-compiler-3.14.0-17.el9.x86_64.rpm [Operation too slow. Less than 1000 bytes/sec transferred the last 300 seconds] 2026-03-08T22:53:20.892 INFO:teuthology.orchestra.run.vm10.stdout:[MIRROR] protobuf-compiler-3.14.0-17.el9.x86_64.rpm: Curl error (28): Timeout was reached for https://ftp.nsc.ru/pub/centos-9/9-stream/CRB/x86_64/os/Packages/protobuf-compiler-3.14.0-17.el9.x86_64.rpm [Operation too slow. Less than 1000 bytes/sec transferred the last 300 seconds] 2026-03-08T22:53:21.604 INFO:teuthology.orchestra.run.vm10.stdout:(137/137): protobuf-compiler-3.14.0-17.el9.x86_ 957 B/s | 862 kB 15:21 2026-03-08T22:53:21.605 INFO:teuthology.orchestra.run.vm10.stdout:-------------------------------------------------------------------------------- 2026-03-08T22:53:21.605 INFO:teuthology.orchestra.run.vm10.stdout:Total 230 kB/s | 210 MB 15:35 2026-03-08T22:53:22.135 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction check 2026-03-08T22:53:22.187 INFO:teuthology.orchestra.run.vm10.stdout:Transaction check succeeded. 2026-03-08T22:53:22.187 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction test 2026-03-08T22:53:23.043 INFO:teuthology.orchestra.run.vm10.stdout:Transaction test succeeded. 
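The three [MIRROR] entries above show the downloader's low-speed check (Curl error 28) firing three times in a row against ftp.nsc.ru: protobuf-compiler-3.14.0-17.el9.x86_64.rpm dribbled in at under 1000 bytes/sec for 300-second stretches before finally completing at 957 B/s after 15:21, which is why a 210 MB download that otherwise ran at tens of MB/s reports an overall rate of 230 kB/s over 15:35. When triaging slow VPS runs it helps to quantify such stalls. The sketch below is a hypothetical helper, not part of teuthology; it assumes the usual one-log-entry-per-line teuthology.log layout and approximates each stall as the gap since the previously timestamped entry.

#!/usr/bin/env python3
"""Sum wall-clock time lost to mirror stalls (Curl error 28) in a
teuthology log.  Illustrative sketch; assumes one log entry per line."""
import re
import sys
from datetime import datetime

TS = re.compile(r'(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+)')
STALL = re.compile(r'\[MIRROR\] (\S+): Curl error \(28\)')

def stalls(lines):
    prev = None
    for line in lines:
        m = TS.search(line)
        if not m:
            continue
        ts = datetime.fromisoformat(m.group(1))
        hit = STALL.search(line)
        if hit and prev is not None:
            # The Curl-28 entry is logged only after the low-speed window
            # has been waited out, so the gap since the previous entry
            # approximates the stall (~305 s each in the log above).
            yield hit.group(1), (ts - prev).total_seconds()
        prev = ts

if __name__ == '__main__':
    total = 0.0
    with open(sys.argv[1]) as fh:
        for pkg, secs in stalls(fh):
            total += secs
            print(f'{pkg}: stalled ~{secs:.0f}s')
    print(f'total mirror stall time: ~{total / 60:.1f} min')

Run against this job's log it would report roughly three five-minute stalls, i.e. nearly all of the 15:35 the download phase took.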
2026-03-08T22:53:23.043 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction 2026-03-08T22:53:23.976 INFO:teuthology.orchestra.run.vm10.stdout: Preparing : 1/1 2026-03-08T22:53:24.010 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 1/139 2026-03-08T22:53:24.023 INFO:teuthology.orchestra.run.vm10.stdout: Installing : thrift-0.15.0-4.el9.x86_64 2/139 2026-03-08T22:53:24.208 INFO:teuthology.orchestra.run.vm10.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/139 2026-03-08T22:53:24.211 INFO:teuthology.orchestra.run.vm10.stdout: Upgrading : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/139 2026-03-08T22:53:24.274 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/139 2026-03-08T22:53:24.277 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 5/139 2026-03-08T22:53:24.306 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 5/139 2026-03-08T22:53:24.315 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 6/139 2026-03-08T22:53:24.367 INFO:teuthology.orchestra.run.vm10.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/139 2026-03-08T22:53:24.371 INFO:teuthology.orchestra.run.vm10.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/139 2026-03-08T22:53:24.377 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/139 2026-03-08T22:53:24.389 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libnbd-1.20.3-4.el9.x86_64 10/139 2026-03-08T22:53:24.390 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 11/139 2026-03-08T22:53:24.427 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 11/139 2026-03-08T22:53:24.430 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 12/139 2026-03-08T22:53:24.443 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 12/139 2026-03-08T22:53:24.485 INFO:teuthology.orchestra.run.vm10.stdout: Installing : re2-1:20211101-20.el9.x86_64 13/139 2026-03-08T22:53:24.529 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 14/139 2026-03-08T22:53:24.536 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 15/139 2026-03-08T22:53:24.562 INFO:teuthology.orchestra.run.vm10.stdout: Installing : liboath-2.6.12-1.el9.x86_64 16/139 2026-03-08T22:53:24.572 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 17/139 2026-03-08T22:53:24.584 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 18/139 2026-03-08T22:53:24.591 INFO:teuthology.orchestra.run.vm10.stdout: Installing : protobuf-3.14.0-17.el9.x86_64 19/139 2026-03-08T22:53:24.596 INFO:teuthology.orchestra.run.vm10.stdout: Installing : lua-5.4.4-4.el9.x86_64 20/139 2026-03-08T22:53:24.602 INFO:teuthology.orchestra.run.vm10.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 21/139 2026-03-08T22:53:24.641 INFO:teuthology.orchestra.run.vm10.stdout: Installing : unzip-6.0-59.el9.x86_64 22/139 2026-03-08T22:53:24.663 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 23/139 2026-03-08T22:53:24.670 
INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 24/139 2026-03-08T22:53:24.678 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 25/139 2026-03-08T22:53:24.736 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 26/139 2026-03-08T22:53:24.767 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 27/139 2026-03-08T22:53:24.774 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 28/139 2026-03-08T22:53:24.785 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9 29/139 2026-03-08T22:53:24.799 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 30/139 2026-03-08T22:53:24.812 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 31/139 2026-03-08T22:53:24.844 INFO:teuthology.orchestra.run.vm10.stdout: Installing : zip-3.0-35.el9.x86_64 32/139 2026-03-08T22:53:24.850 INFO:teuthology.orchestra.run.vm10.stdout: Installing : luarocks-3.9.2-5.el9.noarch 33/139 2026-03-08T22:53:24.860 INFO:teuthology.orchestra.run.vm10.stdout: Installing : lua-devel-5.4.4-4.el9.x86_64 34/139 2026-03-08T22:53:24.891 INFO:teuthology.orchestra.run.vm10.stdout: Installing : protobuf-compiler-3.14.0-17.el9.x86_64 35/139 2026-03-08T22:53:24.959 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 36/139 2026-03-08T22:53:24.977 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 37/139 2026-03-08T22:53:24.987 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-rsa-4.9-2.el9.noarch 38/139 2026-03-08T22:53:25.001 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 39/139 2026-03-08T22:53:25.011 INFO:teuthology.orchestra.run.vm10.stdout: Installing : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 40/139 2026-03-08T22:53:25.018 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 41/139 2026-03-08T22:53:25.040 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 42/139 2026-03-08T22:53:25.070 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 43/139 2026-03-08T22:53:25.079 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 44/139 2026-03-08T22:53:25.086 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 45/139 2026-03-08T22:53:25.102 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 46/139 2026-03-08T22:53:25.117 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 47/139 2026-03-08T22:53:25.151 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 48/139 2026-03-08T22:53:25.220 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 49/139 2026-03-08T22:53:25.230 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 50/139 2026-03-08T22:53:25.240 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 51/139 2026-03-08T22:53:25.291 INFO:teuthology.orchestra.run.vm10.stdout: Installing : 
python3-cachetools-4.2.4-1.el9.noarch 52/139 2026-03-08T22:53:25.685 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 53/139 2026-03-08T22:53:25.702 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 54/139 2026-03-08T22:53:25.860 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 55/139 2026-03-08T22:53:25.902 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 56/139 2026-03-08T22:53:25.992 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 57/139 2026-03-08T22:53:26.000 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 58/139 2026-03-08T22:53:26.005 INFO:teuthology.orchestra.run.vm10.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 59/139 2026-03-08T22:53:26.007 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 60/139 2026-03-08T22:53:26.039 INFO:teuthology.orchestra.run.vm10.stdout: Installing : grpc-data-1.46.7-10.el9.noarch 61/139 2026-03-08T22:53:26.095 INFO:teuthology.orchestra.run.vm10.stdout: Installing : abseil-cpp-20211102.0-4.el9.x86_64 62/139 2026-03-08T22:53:26.111 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-grpcio-1.46.7-10.el9.x86_64 63/139 2026-03-08T22:53:26.120 INFO:teuthology.orchestra.run.vm10.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 64/139 2026-03-08T22:53:26.128 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 65/139 2026-03-08T22:53:26.136 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 66/139 2026-03-08T22:53:26.144 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 67/139 2026-03-08T22:53:26.207 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 68/139 2026-03-08T22:53:26.214 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 69/139 2026-03-08T22:53:26.249 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 70/139 2026-03-08T22:53:26.271 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-protobuf-3.14.0-17.el9.noarch 71/139 2026-03-08T22:53:26.316 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-grpcio-tools-1.46.7-10.el9.x86_64 72/139 2026-03-08T22:53:26.626 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 73/139 2026-03-08T22:53:26.659 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 74/139 2026-03-08T22:53:26.666 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 75/139 2026-03-08T22:53:26.731 INFO:teuthology.orchestra.run.vm10.stdout: Installing : openblas-0.3.29-1.el9.x86_64 76/139 2026-03-08T22:53:26.749 INFO:teuthology.orchestra.run.vm10.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 77/139 2026-03-08T22:53:26.774 INFO:teuthology.orchestra.run.vm10.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 78/139 2026-03-08T22:53:27.185 INFO:teuthology.orchestra.run.vm10.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 79/139 2026-03-08T22:53:27.320 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 80/139 2026-03-08T22:53:28.194 
INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 81/139 2026-03-08T22:53:28.284 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 82/139 2026-03-08T22:53:28.290 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 83/139 2026-03-08T22:53:28.295 INFO:teuthology.orchestra.run.vm10.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 84/139 2026-03-08T22:53:28.463 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 85/139 2026-03-08T22:53:28.467 INFO:teuthology.orchestra.run.vm10.stdout: Upgrading : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 86/139 2026-03-08T22:53:28.500 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 86/139 2026-03-08T22:53:28.504 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 87/139 2026-03-08T22:53:28.517 INFO:teuthology.orchestra.run.vm10.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 88/139 2026-03-08T22:53:28.792 INFO:teuthology.orchestra.run.vm10.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 89/139 2026-03-08T22:53:28.795 INFO:teuthology.orchestra.run.vm10.stdout: Installing : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 90/139 2026-03-08T22:53:28.814 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 90/139 2026-03-08T22:53:28.818 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 91/139 2026-03-08T22:53:30.022 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 92/139 2026-03-08T22:53:30.036 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 92/139 2026-03-08T22:53:30.055 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 92/139 2026-03-08T22:53:30.074 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-pyparsing-2.4.7-9.el9.noarch 93/139 2026-03-08T22:53:30.084 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-packaging-20.9-5.el9.noarch 94/139 2026-03-08T22:53:30.108 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-ply-3.11-14.el9.noarch 95/139 2026-03-08T22:53:30.133 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 96/139 2026-03-08T22:53:30.231 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 97/139 2026-03-08T22:53:30.261 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 98/139 2026-03-08T22:53:30.298 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 99/139 2026-03-08T22:53:30.348 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 100/139 2026-03-08T22:53:30.421 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 101/139 2026-03-08T22:53:30.444 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 102/139 2026-03-08T22:53:30.458 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 103/139 2026-03-08T22:53:30.467 INFO:teuthology.orchestra.run.vm10.stdout: Installing : pciutils-3.7.0-7.el9.x86_64 104/139 2026-03-08T22:53:30.473 
INFO:teuthology.orchestra.run.vm10.stdout: Installing : qatlib-25.08.0-2.el9.x86_64 105/139 2026-03-08T22:53:30.478 INFO:teuthology.orchestra.run.vm10.stdout: Installing : qatlib-service-25.08.0-2.el9.x86_64 106/139 2026-03-08T22:53:30.499 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 106/139 2026-03-08T22:53:30.834 INFO:teuthology.orchestra.run.vm10.stdout: Installing : qatzip-libs-1.3.1-1.el9.x86_64 107/139 2026-03-08T22:53:30.840 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 108/139 2026-03-08T22:53:30.884 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 108/139 2026-03-08T22:53:30.884 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 2026-03-08T22:53:30.884 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 2026-03-08T22:53:30.884 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:30.889 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 109/139 2026-03-08T22:53:37.945 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 109/139 2026-03-08T22:53:37.945 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /sys 2026-03-08T22:53:37.945 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /proc 2026-03-08T22:53:37.945 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /mnt 2026-03-08T22:53:37.945 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /var/tmp 2026-03-08T22:53:37.945 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /home 2026-03-08T22:53:37.945 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /root 2026-03-08T22:53:37.945 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /tmp 2026-03-08T22:53:37.946 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:38.076 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 110/139 2026-03-08T22:53:38.102 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 110/139 2026-03-08T22:53:38.102 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:53:38.102 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-08T22:53:38.102 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-08T22:53:38.102 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-08T22:53:38.102 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:38.520 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 111/139 2026-03-08T22:53:38.546 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 111/139 2026-03-08T22:53:38.547 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-08T22:53:38.547 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-08T22:53:38.547 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-08T22:53:38.547 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-08T22:53:38.547 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:38.693 INFO:teuthology.orchestra.run.vm10.stdout: Installing : mailcap-2.1.49-5.el9.noarch 112/139 2026-03-08T22:53:38.696 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 113/139 2026-03-08T22:53:38.715 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 114/139 2026-03-08T22:53:38.715 INFO:teuthology.orchestra.run.vm10.stdout:Creating group 'qat' with GID 994. 2026-03-08T22:53:38.715 INFO:teuthology.orchestra.run.vm10.stdout:Creating group 'libstoragemgmt' with GID 993. 2026-03-08T22:53:38.715 INFO:teuthology.orchestra.run.vm10.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 993 and GID 993. 2026-03-08T22:53:38.715 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:38.726 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 114/139 2026-03-08T22:53:38.757 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 114/139 2026-03-08T22:53:38.757 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-08T22:53:38.757 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:38.803 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 115/139 2026-03-08T22:53:38.886 INFO:teuthology.orchestra.run.vm10.stdout: Installing : cryptsetup-2.8.1-3.el9.x86_64 116/139 2026-03-08T22:53:38.890 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 117/139 2026-03-08T22:53:38.906 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 117/139 2026-03-08T22:53:38.906 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:53:38.906 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-volume@*.service" escaped as "ceph-volume@\x2a.service". 2026-03-08T22:53:38.906 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:39.794 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 118/139 2026-03-08T22:53:39.829 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 118/139 2026-03-08T22:53:39.829 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:53:39.829 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-08T22:53:39.829 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 
2026-03-08T22:53:39.829 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-08T22:53:39.829 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:39.908 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 119/139 2026-03-08T22:53:39.912 INFO:teuthology.orchestra.run.vm10.stdout: Installing : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 119/139 2026-03-08T22:53:39.922 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 120/139 2026-03-08T22:53:39.947 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 121/139 2026-03-08T22:53:39.951 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 122/139 2026-03-08T22:53:40.553 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 122/139 2026-03-08T22:53:40.561 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 123/139 2026-03-08T22:53:41.193 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 123/139 2026-03-08T22:53:41.195 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 124/139 2026-03-08T22:53:41.268 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 124/139 2026-03-08T22:53:41.339 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 125/139 2026-03-08T22:53:41.348 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 126/139 2026-03-08T22:53:41.372 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 126/139 2026-03-08T22:53:41.372 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:53:41.372 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-08T22:53:41.372 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-08T22:53:41.372 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-08T22:53:41.372 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:41.388 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 127/139 2026-03-08T22:53:41.398 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 127/139 2026-03-08T22:53:41.966 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 128/139 2026-03-08T22:53:41.974 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 129/139 2026-03-08T22:53:42.001 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 129/139 2026-03-08T22:53:42.001 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-08T22:53:42.001 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-08T22:53:42.001 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-08T22:53:42.001 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-08T22:53:42.001 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:42.015 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 130/139 2026-03-08T22:53:42.037 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-immutable-object-cache-2:19.2.3-678.ge911bd 130/139 2026-03-08T22:53:42.037 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:53:42.037 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-08T22:53:42.037 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:42.203 INFO:teuthology.orchestra.run.vm10.stdout: Installing : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 131/139 2026-03-08T22:53:42.226 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 131/139 2026-03-08T22:53:42.226 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:53:42.226 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-08T22:53:42.226 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-08T22:53:42.226 INFO:teuthology.orchestra.run.vm10.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 
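The "Glob pattern passed to enable" / "Invalid unit name" pairs that recur through this transaction (ceph-mds, ceph-mon, ceph-volume, ceph-osd, ceph-mgr, ceph-radosgw, ceph-immutable-object-cache, rbd-mirror) are benign: each package scriptlet hands the template pattern ceph-<daemon>@*.service to systemd, which cannot enable a glob of template instances, only concrete ones such as ceph-osd@0.service, so it escapes the * (ASCII 0x2a) and moves on; the per-daemon targets are still wired up, as the Created symlink lines show. Below is a minimal sketch of that escaping rule, for illustration only; the authoritative logic is systemd's unit_name_escape(), exposed as systemd-escape.

"""Why 'ceph-mds@*.service' is logged as 'ceph-mds@\\x2a.service':
systemd escapes the instance string, keeping only characters valid in
unit names.  Illustrative re-implementation, not systemd's actual code."""

# '-' is deliberately absent: systemd reserves it as the escaped form of '/'.
ALLOWED = set('abcdefghijklmnopqrstuvwxyz'
              'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
              '0123456789:_.')

def unit_name_escape(s: str) -> str:
    out = []
    for i, ch in enumerate(s):
        if ch == '/':
            out.append('-')                     # path separator -> dash
        elif ch in ALLOWED and not (ch == '.' and i == 0):
            out.append(ch)                      # kept verbatim
        else:
            out.append('\\x%02x' % ord(ch))     # everything else -> \xXX
    return ''.join(out)

# '*' is 0x2a, hence the escaped names seen in the scriptlet output above.
assert unit_name_escape('*') == '\\x2a'
assert 'ceph-mds@' + unit_name_escape('*') + '.service' == 'ceph-mds@\\x2a.service'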
2026-03-08T22:53:42.226 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:45.043 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 132/139 2026-03-08T22:53:45.063 INFO:teuthology.orchestra.run.vm10.stdout: Installing : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 133/139 2026-03-08T22:53:45.073 INFO:teuthology.orchestra.run.vm10.stdout: Installing : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 134/139 2026-03-08T22:53:45.136 INFO:teuthology.orchestra.run.vm10.stdout: Installing : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_6 135/139 2026-03-08T22:53:45.148 INFO:teuthology.orchestra.run.vm10.stdout: Installing : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 136/139 2026-03-08T22:53:45.157 INFO:teuthology.orchestra.run.vm10.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 137/139 2026-03-08T22:53:45.158 INFO:teuthology.orchestra.run.vm10.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 138/139 2026-03-08T22:53:45.178 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 138/139 2026-03-08T22:53:45.178 INFO:teuthology.orchestra.run.vm10.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 139/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 139/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 1/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 2/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 3/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 4/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 5/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 6/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 7/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 9/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 10/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 11/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 12/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_6 13/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 14/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 15/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 16/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 
17/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 18/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9 19/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 20/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 21/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 22/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 23/139 2026-03-08T22:53:46.670 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 24/139 2026-03-08T22:53:46.671 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 25/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 26/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 27/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 28/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 29/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 30/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 31/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 32/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 33/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 34/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 35/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 36/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : cryptsetup-2.8.1-3.el9.x86_64 37/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 38/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 39/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 40/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 41/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 42/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : pciutils-3.7.0-7.el9.x86_64 43/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 44/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : 
python3-cryptography-36.0.1-5.el9.x86_64 45/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-ply-3.11-14.el9.noarch 46/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 47/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pyparsing-2.4.7-9.el9.noarch 48/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 49/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 50/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : unzip-6.0-59.el9.x86_64 51/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : zip-3.0-35.el9.x86_64 52/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 53/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 54/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 55/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 56/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libnbd-1.20.3-4.el9.x86_64 57/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 58/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 59/139 2026-03-08T22:53:46.673 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 60/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 61/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 62/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 63/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : lua-5.4.4-4.el9.x86_64 64/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 65/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 66/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : protobuf-3.14.0-17.el9.x86_64 67/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 68/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 69/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 70/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 71/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 72/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 73/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 74/139 2026-03-08T22:53:46.674 
INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 75/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 76/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-packaging-20.9-5.el9.noarch 77/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-protobuf-3.14.0-17.el9.noarch 78/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 79/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 80/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 81/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 82/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 83/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : qatlib-25.08.0-2.el9.x86_64 84/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : qatlib-service-25.08.0-2.el9.x86_64 85/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : qatzip-libs-1.3.1-1.el9.x86_64 86/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 87/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 88/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : lua-devel-5.4.4-4.el9.x86_64 89/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : protobuf-compiler-3.14.0-17.el9.x86_64 90/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : abseil-cpp-20211102.0-4.el9.x86_64 91/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 92/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : grpc-data-1.46.7-10.el9.noarch 93/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 94/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 95/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 96/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 97/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : luarocks-3.9.2-5.el9.noarch 98/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 99/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 100/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 101/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 102/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 103/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : 
python3-cachetools-4.2.4-1.el9.noarch 104/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 105/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 106/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 107/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 108/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-grpcio-1.46.7-10.el9.x86_64 109/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-grpcio-tools-1.46.7-10.el9.x86_64 110/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 111/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 112/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 113/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 114/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 115/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 116/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 117/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 118/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 119/139 2026-03-08T22:53:46.674 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 120/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 121/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 122/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 123/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 124/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 125/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 126/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 127/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 128/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 129/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 130/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 131/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 
132/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 133/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : re2-1:20211101-20.el9.x86_64 134/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 135/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 136/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 137/139 2026-03-08T22:53:46.675 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 138/139 2026-03-08T22:53:46.783 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 139/139 2026-03-08T22:53:46.783 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:46.783 INFO:teuthology.orchestra.run.vm10.stdout:Upgraded: 2026-03-08T22:53:46.783 INFO:teuthology.orchestra.run.vm10.stdout: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.783 INFO:teuthology.orchestra.run.vm10.stdout: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.783 INFO:teuthology.orchestra.run.vm10.stdout:Installed: 2026-03-08T22:53:46.783 INFO:teuthology.orchestra.run.vm10.stdout: abseil-cpp-20211102.0-4.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-immutable-object-cache-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 
INFO:teuthology.orchestra.run.vm10.stdout: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: cryptsetup-2.8.1-3.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: grpc-data-1.46.7-10.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libconfig-1.7.2-9.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libnbd-1.20.3-4.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: lua-5.4.4-4.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: lua-devel-5.4.4-4.el9.x86_64 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: luarocks-3.9.2-5.el9.noarch 2026-03-08T22:53:46.784 INFO:teuthology.orchestra.run.vm10.stdout: 
mailcap-2.1.49-5.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: pciutils-3.7.0-7.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: protobuf-3.14.0-17.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: protobuf-compiler-3.14.0-17.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-grpcio-1.46.7-10.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-grpcio-tools-1.46.7-10.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-jmespath-1.0.1-1.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-08T22:53:46.785 
INFO:teuthology.orchestra.run.vm10.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-packaging-20.9-5.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-ply-3.11-14.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-protobuf-3.14.0-17.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyparsing-2.4.7-9.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-08T22:53:46.785 INFO:teuthology.orchestra.run.vm10.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 
2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: qatlib-25.08.0-2.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: qatlib-service-25.08.0-2.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: qatzip-libs-1.3.1-1.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: re2-1:20211101-20.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: unzip-6.0-59.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: zip-3.0-35.el9.x86_64 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:53:46.786 INFO:teuthology.orchestra.run.vm10.stdout:Complete! 2026-03-08T22:53:46.890 DEBUG:teuthology.parallel:result is None 2026-03-08T22:53:46.890 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-08T22:53:47.595 DEBUG:teuthology.orchestra.run.vm10:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}' 2026-03-08T22:53:47.617 INFO:teuthology.orchestra.run.vm10.stdout:19.2.3-678.ge911bdeb.el9 2026-03-08T22:53:47.618 INFO:teuthology.packaging:The installed version of ceph is 19.2.3-678.ge911bdeb.el9 2026-03-08T22:53:47.618 INFO:teuthology.task.install:The correct ceph version 19.2.3-678.ge911bdeb is installed. 2026-03-08T22:53:47.619 INFO:teuthology.task.install.util:Shipping valgrind.supp... 2026-03-08T22:53:47.619 DEBUG:teuthology.orchestra.run.vm10:> set -ex 2026-03-08T22:53:47.619 DEBUG:teuthology.orchestra.run.vm10:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp 2026-03-08T22:53:47.697 INFO:teuthology.task.install.util:Shipping 'daemon-helper'... 2026-03-08T22:53:47.697 DEBUG:teuthology.orchestra.run.vm10:> set -ex 2026-03-08T22:53:47.697 DEBUG:teuthology.orchestra.run.vm10:> sudo dd of=/usr/bin/daemon-helper 2026-03-08T22:53:47.767 DEBUG:teuthology.orchestra.run.vm10:> sudo chmod a=rx -- /usr/bin/daemon-helper 2026-03-08T22:53:47.838 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'... 2026-03-08T22:53:47.838 DEBUG:teuthology.orchestra.run.vm10:> set -ex 2026-03-08T22:53:47.838 DEBUG:teuthology.orchestra.run.vm10:> sudo dd of=/usr/bin/adjust-ulimits 2026-03-08T22:53:47.905 DEBUG:teuthology.orchestra.run.vm10:> sudo chmod a=rx -- /usr/bin/adjust-ulimits 2026-03-08T22:53:47.969 INFO:teuthology.task.install.util:Shipping 'stdin-killer'... 
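The install.util steps above push each helper by streaming its contents into dd on the node and then marking it world-executable; no staging copy is involved. A minimal sketch of the same pattern over a plain SSH channel (the ssh transport is a stand-in for teuthology's own connection, and ./daemon-helper is a hypothetical local copy of the script):

    # Stream a local helper script to the node and make it executable.
    ssh ubuntu@vm10.local 'sudo dd of=/usr/bin/daemon-helper' < ./daemon-helper
    ssh ubuntu@vm10.local 'sudo chmod a=rx -- /usr/bin/daemon-helper'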
2026-03-08T22:53:47.969 DEBUG:teuthology.orchestra.run.vm10:> set -ex 2026-03-08T22:53:47.969 DEBUG:teuthology.orchestra.run.vm10:> sudo dd of=/usr/bin/stdin-killer 2026-03-08T22:53:48.035 DEBUG:teuthology.orchestra.run.vm10:> sudo chmod a=rx -- /usr/bin/stdin-killer 2026-03-08T22:53:48.101 INFO:teuthology.run_tasks:Running task workunit... 2026-03-08T22:53:48.105 INFO:tasks.workunit:Pulling workunits from ref 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:53:48.105 INFO:tasks.workunit:Making a separate scratch dir for every client... 2026-03-08T22:53:48.105 INFO:tasks.workunit:timeout=3h 2026-03-08T22:53:48.105 INFO:tasks.workunit:cleanup=True 2026-03-08T22:53:48.105 DEBUG:teuthology.orchestra.run.vm10:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:53:48.160 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-08T22:53:48.160 INFO:teuthology.orchestra.run.vm10.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory 2026-03-08T22:53:48.160 DEBUG:teuthology.orchestra.run.vm10:> mkdir -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:53:48.222 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0 2026-03-08T22:53:48.223 DEBUG:teuthology.orchestra.run.vm10:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0 2026-03-08T22:53:48.283 DEBUG:teuthology.orchestra.run.vm10:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:53:48.345 INFO:tasks.workunit.client.0.vm10.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr: 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr:state without impacting any branches by switching back to a branch. 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr: 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr: 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr: git switch -c <new-branch-name> 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr: 2026-03-08T22:54:32.692 INFO:tasks.workunit.client.0.vm10.stderr:Or undo this operation with: 2026-03-08T22:54:32.693 INFO:tasks.workunit.client.0.vm10.stderr: 2026-03-08T22:54:32.693 INFO:tasks.workunit.client.0.vm10.stderr: git switch - 2026-03-08T22:54:32.693 INFO:tasks.workunit.client.0.vm10.stderr: 2026-03-08T22:54:32.693 INFO:tasks.workunit.client.0.vm10.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-08T22:54:32.693 INFO:tasks.workunit.client.0.vm10.stderr: 2026-03-08T22:54:32.693 INFO:tasks.workunit.client.0.vm10.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-08T22:54:32.699 DEBUG:teuthology.orchestra.run.vm10:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/standalone && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-08T22:54:32.757 DEBUG:teuthology.orchestra.run.vm10:> set -ex 2026-03-08T22:54:32.757 DEBUG:teuthology.orchestra.run.vm10:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-08T22:54:32.815 INFO:tasks.workunit:Running workunits matching crush on client.0... 2026-03-08T22:54:32.815 INFO:tasks.workunit:Running workunit crush/crush-choose-args.sh... 2026-03-08T22:54:32.815 DEBUG:teuthology.orchestra.run.vm10:workunit test crush/crush-choose-args.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh 2026-03-08T22:54:32.879 INFO:tasks.workunit.client.0.vm10.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T22:54:32.883
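Each workunit runs from a per-client scratch dir with the test environment exported and the script wrapped in adjust-ulimits, ceph-coverage and a 3h timeout. A trimmed sketch for rerunning the same script interactively (ceph-coverage and the CEPH_CLI_TEST_DUP_COMMAND/CEPH_ID knobs omitted for brevity):

    export TESTDIR=/home/ubuntu/cephtest
    export CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0
    export CEPH_ARGS="--cluster ceph"
    mkdir -p $TESTDIR/mnt.0/client.0/tmp && cd $TESTDIR/mnt.0/client.0/tmp
    # Same 3-hour cap the task applies; drop 'timeout' when debugging.
    adjust-ulimits timeout 3h $CEPH_ROOT/qa/standalone/crush/crush-choose-args.sh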
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/crush-choose-args 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:21: run: local dir=td/crush-choose-args 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:22: run: shift 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:24: run: export CEPH_MON=127.0.0.1:7131 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:24: run: CEPH_MON=127.0.0.1:7131 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:25: run: export CEPH_ARGS 2026-03-08T22:54:32.883 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:26: run: uuidgen 2026-03-08T22:54:32.885 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:26: run: CEPH_ARGS+='--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none ' 2026-03-08T22:54:32.885 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:27: run: CEPH_ARGS+='--mon-host=127.0.0.1:7131 ' 2026-03-08T22:54:32.885 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:28: run: CEPH_ARGS+='--crush-location=root=default,host=HOST ' 2026-03-08T22:54:32.885 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:29: run: CEPH_ARGS+='--osd-crush-initial-weight=3 ' 2026-03-08T22:54:32.885 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:35: run: CEPH_ARGS+='--osd-class-update-on-start=false ' 2026-03-08T22:54:32.885 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:37: run: set 2026-03-08T22:54:32.885 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:37: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p' 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:37: run: local 'funcs=TEST_choose_args_update 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:TEST_move_bucket 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:TEST_no_update_weight_set 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:TEST_reweight' 2026-03-08T22:54:32.887 
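run() builds one CEPH_ARGS string up front and exports it, so every ceph, ceph-mon and ceph-osd invocation in the test inherits the fsid, mon address and CRUSH location without a ceph.conf (CEPH_CONF is /dev/null here). The assembly, as traced:

    export CEPH_MON=127.0.0.1:7131
    export CEPH_ARGS
    CEPH_ARGS+="--fsid=$(uuidgen) --auth-supported=none "
    CEPH_ARGS+="--mon-host=$CEPH_MON "
    CEPH_ARGS+="--crush-location=root=default,host=HOST "
    CEPH_ARGS+="--osd-crush-initial-weight=3 "
    CEPH_ARGS+="--osd-class-update-on-start=false "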
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:38: run: for func in $funcs 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:39: run: setup td/crush-choose-args 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/crush-choose-args 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/crush-choose-args 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-choose-args 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-choose-args KILL 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:54:32.887 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:54:32.889 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:54:32.889 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:54:32.890 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:54:32.890 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
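The harness discovers its cases by scraping shell state for functions named TEST_* and brackets each one with setup/teardown; teardown in turn starts by KILLing any daemons left over from a previous case. The loop shape, as traced:

    funcs=$(set | sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p')
    for func in $funcs ; do
        setup $dir || return 1      # fresh test dir + admin-socket dir
        $func $dir || return 1      # e.g. TEST_choose_args_update
        teardown $dir || return 1   # kill daemons, check cores, rm -fr $dir
    done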
2026-03-08T22:54:32.891 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:54:32.891 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:54:32.891 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:54:32.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:54:32.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:54:32.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:54:32.892 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:54:32.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:54:32.894 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:54:32.894 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-choose-args 2026-03-08T22:54:32.895 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:54:32.895 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:32.895 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:54:32.895 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50888 2026-03-08T22:54:32.896 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:54:32.896 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:54:32.896 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/crush-choose-args 2026-03-08T22:54:32.897 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:54:32.898 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:32.898 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:54:32.898 
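The admin-socket directory defaults to a per-PID path under /tmp (ceph-asok.50888 here), so concurrent standalone runs on one box do not collide. A sketch of the fallback; the name of the override variable is an assumption, since the trace only shows the '[ -n ... ]' test:

    get_asok_dir() {
        if [ -n "$CEPH_ASOK_DIR" ]; then  # assumed override variable name
            echo "$CEPH_ASOK_DIR"
        else
            echo "/tmp/ceph-asok.$$"      # $$ = 50888 in this run
        fi
    }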
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50888 2026-03-08T22:54:32.899 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:54:32.899 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']' 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/crush-choose-args 1' TERM HUP INT 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:40: run: TEST_choose_args_update td/crush-choose-args 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:49: TEST_choose_args_update: local dir=td/crush-choose-args 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:51: TEST_choose_args_update: run_mon td/crush-choose-args a 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/crush-choose-args 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/crush-choose-args/a 2026-03-08T22:54:32.900 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-choose-args/a --run-dir=td/crush-choose-args 2026-03-08T22:54:33.087 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:54:33.087 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:33.087 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:33.087 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:33.087 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.087 
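setup() also raises the open-file limit when it is still at the default and installs a signal trap, so an interrupted run still tears its daemons down. The two lines of interest, as traced:

    # From setup(): bump fd limit if at the 1024 default; tear down on signals.
    [ "$(ulimit -n)" -le 1024 ] && ulimit -n 4096
    trap 'teardown $dir 1' TERM HUP INT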
INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:54:33.087 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:54:33.087 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-choose-args/a '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --mon-cluster-log-file=td/crush-choose-args/log --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:54:33.122 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:54:33.123 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:54:33.123 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:54:33.123 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:54:33.123 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:54:33.123 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:54:33.124 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:54:33.124 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:54:33.124 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:54:33.124 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:54:33.124 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.124 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:54:33.124 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50888/ceph-mon.a.asok 2026-03-08T22:54:33.124 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 
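run_mon is two invocations: a --mkfs pass to create the store, then the daemon itself. The --log-file, --admin-socket and --pid-file arguments stay single-quoted so that $name and $cluster reach ceph-mon unexpanded and are filled in by its own metavariable expansion. The skeleton, as traced (debug and ratio flags elided):

    dir=td/crush-choose-args
    ceph-mon --id a --mkfs --mon-data=$dir/a --run-dir=$dir
    ceph-mon --id a --mon-data=$dir/a --run-dir=$dir \
        '--log-file=td/crush-choose-args/$name.log' \
        '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' \
        '--pid-file=td/crush-choose-args/$name.pid'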
2026-03-08T22:54:33.124 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50888/ceph-mon.a.asok config get fsid 2026-03-08T22:54:33.180 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:54:33.180 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:54:33.180 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:54:33.180 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:54:33.180 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:54:33.181 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:54:33.181 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:54:33.181 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:54:33.181 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:54:33.181 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.181 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:54:33.181 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50888/ceph-mon.a.asok 2026-03-08T22:54:33.181 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:54:33.181 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50888/ceph-mon.a.asok config get mon_host 2026-03-08T22:54:33.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:52: TEST_choose_args_update: run_mgr td/crush-choose-args x 2026-03-08T22:54:33.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/crush-choose-args 2026-03-08T22:54:33.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:54:33.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:54:33.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 
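get_config asks the live daemon itself, over its admin socket, and clears CEPH_ARGS for that one call so the client does not go looking for a monitor. The probe, as traced:

    # Read mon.a's fsid straight from the daemon via its admin socket.
    CEPH_ARGS= ceph --format json \
        daemon /tmp/ceph-asok.50888/ceph-mon.a.asok config get fsid |
        jq -r .fsid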
2026-03-08T22:54:33.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/crush-choose-args/x 2026-03-08T22:54:33.235 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:54:33.363 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:54:33.363 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:33.363 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:33.363 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:33.363 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.363 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:54:33.363 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:54:33.363 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:54:33.364 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/crush-choose-args/x '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:53: TEST_choose_args_update: run_osd td/crush-choose-args 0 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-choose-args 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-choose-args/0 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 
'ceph_args=--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false ' 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/0' 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/0/journal' 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:33.387 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:33.388 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:33.388 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:33.388 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:54:33.390 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:33.391 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:54:33.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-choose-args/0 2026-03-08T22:54:33.392 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:33.393 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=373746ed-55f2-4403-b3a9-cb8f9593f9a7 2026-03-08T22:54:33.393 INFO:tasks.workunit.client.0.vm10.stdout:add osd0 373746ed-55f2-4403-b3a9-cb8f9593f9a7 2026-03-08T22:54:33.393 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 373746ed-55f2-4403-b3a9-cb8f9593f9a7' 2026-03-08T22:54:33.393 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:33.407 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAp/q1pR0RFGBAAWP0/yQZ/SqL+2PlbFv82Yg== 2026-03-08T22:54:33.407 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAp/q1pR0RFGBAAWP0/yQZ/SqL+2PlbFv82Yg=="}' 2026-03-08T22:54:33.407 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 373746ed-55f2-4403-b3a9-cb8f9593f9a7 -i td/crush-choose-args/0/new.json 2026-03-08T22:54:33.537 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:54:33.549 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm 
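Registering a new OSD is three steps here: mint a uuid, mint a cephx secret, and feed both to 'ceph osd new', which allocates the next free id and prints it (0 in this run). Condensed from the trace:

    uuid=$(uuidgen)
    OSD_SECRET=$(ceph-authtool --gen-print-key)
    echo '{"cephx_secret": "'$OSD_SECRET'"}' > $dir/0/new.json
    id=$(ceph osd new $uuid -i $dir/0/new.json)   # prints the allocated id
    rm $dir/0/new.json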
td/crush-choose-args/0/new.json 2026-03-08T22:54:33.550 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAp/q1pR0RFGBAAWP0/yQZ/SqL+2PlbFv82Yg== --osd-uuid 373746ed-55f2-4403-b3a9-cb8f9593f9a7 2026-03-08T22:54:33.576 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:33.574+0000 7f02ddaed780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:33.576 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:33.576+0000 7f02ddaed780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:33.579 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:33.580+0000 7f02ddaed780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:33.579 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:33.580+0000 7f02ddaed780 -1 bdev(0x563faa3d9c00 td/crush-choose-args/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:54:33.580 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:33.580+0000 7f02ddaed780 -1 bluestore(td/crush-choose-args/0) _read_fsid unparsable uuid 2026-03-08T22:54:35.708 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-choose-args/0/keyring 2026-03-08T22:54:35.708 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:54:35.709 INFO:tasks.workunit.client.0.vm10.stdout:adding osd0 key to auth repository 2026-03-08T22:54:35.709 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:54:35.709 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-choose-args/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:54:35.856 INFO:tasks.workunit.client.0.vm10.stdout:start osd.0 2026-03-08T22:54:35.856 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:54:35.856 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:54:35.857 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:54:35.860 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd 
dump --format=json 2026-03-08T22:54:35.866 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:54:35.913 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:35.910+0000 7f00d2fd4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:35.923 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:35.924+0000 7f00d2fd4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:35.928 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:35.929+0000 7f00d2fd4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:36.032 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:54:36.032 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:36.032 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:54:36.032 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:36.032 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:54:36.032 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:36.032 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:36.033 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:36.033 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:36.033 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:36.178 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:37.180 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:37.180 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:37.180 
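Before polling, run_osd checks whether the cluster-wide noup flag is set, in which case waiting for the OSD to come up would be pointless. The probe from the trace:

    # Exit 0 only if 'noup' appears among the osdmap flags.
    ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"'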
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:37.180 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:54:37.180 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:37.180 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:37.281 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:37.282+0000 7f00d2fd4780 -1 Falling back to public interface 2026-03-08T22:54:37.405 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:38.148 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:38.149+0000 7f00d2fd4780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:54:38.406 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:38.406 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:38.406 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:54:38.406 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:38.408 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:38.408 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:38.645 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:39.263 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:39.264+0000 7f00ce775640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:54:39.647 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:39.647 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:39.647 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:54:39.647 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:54:39.648 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:39.648 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:39.877 INFO:tasks.workunit.client.0.vm10.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/364103056,v1:127.0.0.1:6803/364103056] [v2:127.0.0.1:6804/364103056,v1:127.0.0.1:6805/364103056] exists,up 373746ed-55f2-4403-b3a9-cb8f9593f9a7 2026-03-08T22:54:39.878 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:39.878 
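wait_for_osd is a plain one-second poll of the osdmap, capped at 300 tries; it succeeds as soon as 'osd.0 up' (matched above) appears in the dump. An equivalent sketch of the traced loop:

    wait_for_osd() {
        local state=$1 id=$2 status=1
        for ((i=0; i < 300; i++)); do
            if ceph osd dump | grep -q "osd.$id $state"; then
                status=0
                break
            fi
            sleep 1
        done
        return $status
    }
    wait_for_osd up 0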
2026-03-08T22:54:39.878 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:54:39.878 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:54:39.878 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:55: TEST_choose_args_update: ceph osd set-require-min-compat-client luminous
2026-03-08T22:54:40.216 INFO:tasks.workunit.client.0.vm10.stderr:set require_min_compat_client to luminous
2026-03-08T22:54:40.229 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:56: TEST_choose_args_update: ceph osd getcrushmap
2026-03-08T22:54:40.447 INFO:tasks.workunit.client.0.vm10.stderr:2
2026-03-08T22:54:40.460 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:57: TEST_choose_args_update: crushtool -d td/crush-choose-args/map -o td/crush-choose-args/map.txt
2026-03-08T22:54:40.475 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:58: TEST_choose_args_update: sed -i -e '/end crush map/d' td/crush-choose-args/map.txt
2026-03-08T22:54:40.477 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:59: TEST_choose_args_update: cat
2026-03-08T22:54:40.478 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:82: TEST_choose_args_update: crushtool -c td/crush-choose-args/map.txt -o td/crush-choose-args/map-new
2026-03-08T22:54:40.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:83: TEST_choose_args_update: ceph osd setcrushmap -i td/crush-choose-args/map-new
2026-03-08T22:54:40.821 INFO:tasks.workunit.client.0.vm10.stderr:4
2026-03-08T22:54:40.835 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:84: TEST_choose_args_update: ceph osd crush tree
2026-03-08T22:54:41.059 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT 0 TYPE NAME
2026-03-08T22:54:41.059 INFO:tasks.workunit.client.0.vm10.stdout:-1 3.00000 root default
2026-03-08T22:54:41.059 INFO:tasks.workunit.client.0.vm10.stdout:-2 3.00000 2.00000 host HOST
2026-03-08T22:54:41.059 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0
2026-03-08T22:54:41.069 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:86: TEST_choose_args_update: run_osd td/crush-choose-args 1
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-choose-args
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
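
The sequence at crush-choose-args.sh:55-83 above is a decompile/edit/recompile round trip on the cluster's CRUSH map. Condensed, with an illustrative (not verbatim) choose_args payload standing in for what the test's cat appends:

    # Fetch, decompile, append a choose_args section, recompile, inject.
    ceph osd getcrushmap > map          # binary map; a bare number on stderr
    crushtool -d map -o map.txt         # decompile to text
    sed -i -e '/end crush map/d' map.txt
    cat >> map.txt <<'EOF'
    choose_args 0 {
      {
        bucket_id -1
        weight_set [ [ 2.00000 ] ]
      }
    }
    # end crush map
    EOF
    crushtool -c map.txt -o map-new     # recompile
    ceph osd setcrushmap -i map-new     # inject the edited map

The bare numbers on stderr (2, then 4) appear to be CRUSH map versions reported by getcrushmap/setcrushmap, and the extra numeric column in the ceph osd crush tree output that follows is the choose_args (compat weight-set) weight shown alongside the regular CRUSH weight.
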
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-choose-args/1
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false '
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/1'
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/1/journal'
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-choose-args'
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:54:41.070 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok'
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok'
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log'
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid'
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:54:41.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-choose-args/1
2026-03-08T22:54:41.072 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:54:41.073 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=51fcd408-af91-438c-98da-c30ea9dbdf6e
2026-03-08T22:54:41.073 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 51fcd408-af91-438c-98da-c30ea9dbdf6e'
2026-03-08T22:54:41.073 INFO:tasks.workunit.client.0.vm10.stdout:add osd1 51fcd408-af91-438c-98da-c30ea9dbdf6e
2026-03-08T22:54:41.074 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:54:41.088 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAx/q1p1FA7BRAA11sgwPkD/GlTVYZYvkBw5A==
2026-03-08T22:54:41.088 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAx/q1p1FA7BRAA11sgwPkD/GlTVYZYvkBw5A=="}'
2026-03-08T22:54:41.088 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 51fcd408-af91-438c-98da-c30ea9dbdf6e -i td/crush-choose-args/1/new.json
2026-03-08T22:54:41.323 INFO:tasks.workunit.client.0.vm10.stdout:1
2026-03-08T22:54:41.335 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-choose-args/1/new.json
2026-03-08T22:54:41.336 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAx/q1p1FA7BRAA11sgwPkD/GlTVYZYvkBw5A== --osd-uuid 51fcd408-af91-438c-98da-c30ea9dbdf6e
2026-03-08T22:54:41.357 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:41.357+0000 7efef020c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:41.358 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:41.359+0000 7efef020c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:41.360 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:41.360+0000 7efef020c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:41.360 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:41.360+0000 7efef020c780 -1 bdev(0x55e98c7c7c00 td/crush-choose-args/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:54:41.360 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:41.360+0000 7efef020c780 -1 bluestore(td/crush-choose-args/1) _read_fsid unparsable uuid
2026-03-08T22:54:44.068 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-choose-args/1/keyring
2026-03-08T22:54:44.069 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:54:44.070 INFO:tasks.workunit.client.0.vm10.stdout:adding osd1 key to auth repository
2026-03-08T22:54:44.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T22:54:44.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-choose-args/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:54:44.398 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T22:54:44.398 INFO:tasks.workunit.client.0.vm10.stdout:start osd.1
2026-03-08T22:54:44.398 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:54:44.398 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:54:44.399 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:54:44.401 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:54:44.418 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:44.418+0000 7f388b72e780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:44.426 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:44.427+0000 7f388b72e780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:44.427 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:44.428+0000 7f388b72e780 -1 WARNING: all dangerous and experimental features are enabled.
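
run_osd's provisioning path for a brand-new OSD is visible in the trace: generate a uuid and cephx key, register the id with ceph osd new, format the data directory with ceph-osd --mkfs, install the key, then start the daemon. Roughly, under the same CEPH_ARGS environment the helper uses (paths and the secret below stand in for whatever a given run generates):

    # Sketch of run_osd's bring-up sequence, condensed from the xtrace above.
    dir=td/crush-choose-args id=1
    uuid=$(uuidgen)
    OSD_SECRET=$(ceph-authtool --gen-print-key)
    mkdir -p $dir/$id
    echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > $dir/$id/new.json
    ceph osd new $uuid -i $dir/$id/new.json       # allocates the id (prints 1)
    rm $dir/$id/new.json
    ceph-osd -i $id --mkfs --key $OSD_SECRET --osd-uuid $uuid \
             --osd-data=$dir/$id --osd-journal=$dir/$id/journal
    cat > $dir/$id/keyring <<EOF
    [osd.$id]
    key = $OSD_SECRET
    EOF
    ceph -i $dir/$id/keyring auth add osd.$id \
         osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
    ceph-osd -i $id --osd-data=$dir/$id --osd-journal=$dir/$id/journal

The bdev "open stat got: (1) Operation not permitted" and bluestore "_read_fsid unparsable uuid" messages during --mkfs are emitted while the freshly created data directory is still empty; the format evidently proceeds past them, as the successful start and "up" transition below confirm. The jq/grep pipeline over ceph osd dump --format=json checks the osdmap's flags_set for "noup", so the helper only waits for the OSD to come up when it actually can.
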
2026-03-08T22:54:44.640 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T22:54:44.641 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:54:44.641 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:54:44.641 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:54:44.641 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:54:44.641 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:54:44.641 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:54:44.641 INFO:tasks.workunit.client.0.vm10.stdout:0
2026-03-08T22:54:44.641 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:54:44.641 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:54:44.874 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:54:45.753 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:45.753+0000 7f388b72e780 -1 Falling back to public interface
2026-03-08T22:54:45.875 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:54:45.875 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:54:45.875 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:54:45.875 INFO:tasks.workunit.client.0.vm10.stdout:1
2026-03-08T22:54:45.876 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:54:45.876 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:54:46.101 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:54:46.631 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:46.631+0000 7f388b72e780 -1 osd.1 0 log_to_monitors true
2026-03-08T22:54:47.103 INFO:tasks.workunit.client.0.vm10.stdout:2
2026-03-08T22:54:47.103 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:54:47.103 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:54:47.103 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:54:47.103 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:54:47.103 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:54:47.351 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:54:48.354 INFO:tasks.workunit.client.0.vm10.stdout:3
2026-03-08T22:54:48.354 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:54:48.354 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:54:48.354 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:54:48.354 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:54:48.354 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:54:48.612 INFO:tasks.workunit.client.0.vm10.stdout:osd.1 up in weight 1 up_from 12 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1641361881,v1:127.0.0.1:6811/1641361881] [v2:127.0.0.1:6812/1641361881,v1:127.0.0.1:6813/1641361881] exists,up 51fcd408-af91-438c-98da-c30ea9dbdf6e
2026-03-08T22:54:48.612 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:54:48.612 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:54:48.612 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:54:48.612 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:87: TEST_choose_args_update: ceph osd crush tree
2026-03-08T22:54:48.834 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT 0 TYPE NAME
2026-03-08T22:54:48.834 INFO:tasks.workunit.client.0.vm10.stdout:-1 6.00000 root default
2026-03-08T22:54:48.834 INFO:tasks.workunit.client.0.vm10.stdout:-2 6.00000 5.00000 host HOST
2026-03-08T22:54:48.834 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0
2026-03-08T22:54:48.834 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 3.00000 osd.1
2026-03-08T22:54:48.845 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:88: TEST_choose_args_update: ceph osd getcrushmap
2026-03-08T22:54:49.076 INFO:tasks.workunit.client.0.vm10.stderr:5
2026-03-08T22:54:49.089 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:89: TEST_choose_args_update: crushtool -d td/crush-choose-args/map-one-more -o td/crush-choose-args/map-one-more.txt
2026-03-08T22:54:49.108 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:90: TEST_choose_args_update: cat td/crush-choose-args/map-one-more.txt
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:# begin crush map
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:tunable choose_local_tries 0
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:tunable choose_local_fallback_tries 0
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:tunable choose_total_tries 50
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:tunable chooseleaf_descend_once 1
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:tunable chooseleaf_vary_r 1
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:tunable chooseleaf_stable 1
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:tunable straw_calc_version 1
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:tunable allowed_bucket_algs 54
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:# devices
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:device 0 osd.0
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:device 1 osd.1
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:# types
2026-03-08T22:54:49.109 INFO:tasks.workunit.client.0.vm10.stdout:type 0 osd
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:type 1 host
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:type 2 chassis
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:type 3 rack
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:type 4 row
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:type 5 pdu
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:type 6 pod
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:type 7 room
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:type 8 datacenter
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:type 9 zone
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:type 10 region
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:type 11 root
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:# buckets
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:host HOST {
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: id -2 # do not change unnecessarily
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: # weight 6.00000
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: alg straw2
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: hash 0 # rjenkins1
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: item osd.0 weight 3.00000
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: item osd.1 weight 3.00000
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:}
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:root default {
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: id -1 # do not change unnecessarily
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: # weight 6.00000
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: alg straw2
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: hash 0 # rjenkins1
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: item HOST weight 6.00000
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:}
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:# rules
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:rule replicated_rule {
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: id 0
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: type replicated
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: step take default
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: step choose firstn 0 type osd
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: step emit
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:}
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:# choose_args
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout:choose_args 0 {
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: {
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: bucket_id -1
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: weight_set [
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: [ 5.00000 ]
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: [ 5.00000 ]
2026-03-08T22:54:49.110 INFO:tasks.workunit.client.0.vm10.stdout: ]
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout: ids [ -10 ]
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout: }
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout: {
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout: bucket_id -2
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout: weight_set [
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout: [ 2.00000 3.00000 ]
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout: [ 2.00000 3.00000 ]
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout: ]
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout: ids [ -20 1 ]
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout: }
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout:}
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout:
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stdout:# end crush map
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:91: TEST_choose_args_update: diff -u td/crush-choose-args/map-one-more.txt /home/ubuntu/cephtest/clone.client.0/src/test/crush/crush-choose-args-expected-one-more-3.txt
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:93: TEST_choose_args_update: destroy_osd td/crush-choose-args 1
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:784: destroy_osd: local dir=td/crush-choose-args
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:785: destroy_osd: local id=1
2026-03-08T22:54:49.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:787: destroy_osd: ceph osd out osd.1
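
In the decompiled map above, each choose_args entry carries a weight_set (one alternative weight vector per position, used in place of the bucket's item weights during placement) and, for bucket -2, an ids list that substitutes alternate bucket ids as placement seeds. The diff against src/test/crush/crush-choose-args-expected-one-more-3.txt passes silently, confirming that adding osd.1 extended the weight_set rows exactly as expected. The same compat weight-set that shows up as the extra column in ceph osd crush tree can also be driven from the CLI; from memory of the Luminous-era commands, not this trace:

    ceph osd crush weight-set create-compat           # add a compat weight-set
    ceph osd crush weight-set reweight-compat osd.0 2.0
    ceph osd crush weight-set rm-compat               # drop it again
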
2026-03-08T22:54:49.404 INFO:tasks.workunit.client.0.vm10.stderr:osd.1 is already out.
2026-03-08T22:54:49.415 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:788: destroy_osd: kill_daemons td/crush-choose-args TERM osd.1
2026-03-08T22:54:49.415 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:54:49.415 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:54:49.416 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:54:49.416 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:54:49.416 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:54:49.522 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:54:49.522 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:789: destroy_osd: ceph osd down osd.1
2026-03-08T22:54:49.750 INFO:tasks.workunit.client.0.vm10.stderr:osd.1 is already down.
2026-03-08T22:54:49.762 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:790: destroy_osd: ceph osd purge osd.1 --yes-i-really-mean-it
2026-03-08T22:54:49.977 INFO:tasks.workunit.client.0.vm10.stderr:osd.1 does not exist
2026-03-08T22:54:49.987 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:791: destroy_osd: teardown td/crush-choose-args/1
2026-03-08T22:54:49.987 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-choose-args/1
2026-03-08T22:54:49.987 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:54:49.987 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-choose-args/1 KILL
2026-03-08T22:54:49.987 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:54:49.987 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:54:49.987 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:54:49.987 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:54:49.987 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:54:49.989 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:54:49.989 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:54:49.990 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:54:49.990 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:54:49.991 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:54:49.991 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:54:49.991 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:54:49.992 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:54:49.992 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:54:49.992 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:54:49.993 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:54:49.993 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:54:49.994 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:54:49.994 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-choose-args/1
2026-03-08T22:54:49.997 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:54:49.997 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:49.997 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:54:49.997 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50888
2026-03-08T22:54:49.999 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:54:49.999 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:54:49.999 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:792: destroy_osd: rm -fr td/crush-choose-args/1
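
destroy_osd's removal path, condensed from the trace (the "already out"/"already down"/"does not exist" replies show each step is tolerated when redundant, so the helper is idempotent):

    ceph osd out osd.1
    # kill_daemons signals the pid recorded in the run dir; a sketch of the
    # effect, not the helper's actual retry/escalation logic
    kill -TERM $(cat td/crush-choose-args/osd.1.pid)
    ceph osd down osd.1
    ceph osd purge osd.1 --yes-i-really-mean-it   # auth key, crush item, id
    rm -fr td/crush-choose-args/1

Note that teardown inspects kernel.core_pattern and lists the coredump directory before deleting anything; a crashed daemon's cores would evidently flip it into the log-dumping branch instead of this silent cleanup.
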
2026-03-08T22:54:50.000 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:94: TEST_choose_args_update: ceph osd crush tree
2026-03-08T22:54:50.228 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT 0 TYPE NAME
2026-03-08T22:54:50.228 INFO:tasks.workunit.client.0.vm10.stdout:-1 3.00000 root default
2026-03-08T22:54:50.228 INFO:tasks.workunit.client.0.vm10.stdout:-2 3.00000 2.00000 host HOST
2026-03-08T22:54:50.228 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0
2026-03-08T22:54:50.240 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:95: TEST_choose_args_update: ceph osd getcrushmap
2026-03-08T22:54:50.471 INFO:tasks.workunit.client.0.vm10.stderr:6
2026-03-08T22:54:50.482 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:96: TEST_choose_args_update: crushtool -d td/crush-choose-args/map-one-less -o td/crush-choose-args/map-one-less.txt
2026-03-08T22:54:50.496 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:97: TEST_choose_args_update: diff -u td/crush-choose-args/map-one-less.txt td/crush-choose-args/map.txt
2026-03-08T22:54:50.498 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:41: run: teardown td/crush-choose-args
2026-03-08T22:54:50.498 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-choose-args
2026-03-08T22:54:50.498 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:54:50.498 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-choose-args KILL
2026-03-08T22:54:50.498 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:54:50.498 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:54:50.498 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:54:50.498 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:54:50.498 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:54:50.613 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:54:50.613 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:54:50.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:54:50.614 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
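
The diff -u at crush-choose-args.sh:97 produces no output, i.e. after purging osd.1 the decompiled map, choose_args included, is byte-for-byte back to its pre-add state. The same round-trip check can be written without intermediate files, assuming crushtool -d defaults to stdout when -o is omitted:

    # One-shot assertion that the live map matches a saved text decompile.
    ceph osd getcrushmap > map-now
    diff -u <(crushtool -d map-now) td/crush-choose-args/map.txt
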
2026-03-08T22:54:50.615 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:54:50.615 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:54:50.615 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:54:50.616 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:54:50.616 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:54:50.617 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:54:50.617 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:54:50.617 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:54:50.619 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:54:50.619 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-choose-args
2026-03-08T22:54:50.626 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:54:50.626 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:50.626 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:54:50.627 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50888
2026-03-08T22:54:50.627 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:38: run: for func in $funcs
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:39: run: setup td/crush-choose-args
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/crush-choose-args
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/crush-choose-args
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-choose-args
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-choose-args KILL
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:54:50.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:54:50.629 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:54:50.630 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:54:50.630 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:54:50.631 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:54:50.631 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:54:50.632 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:54:50.632 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:54:50.632 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:54:50.632 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:54:50.633 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:54:50.633 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:54:50.634 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:54:50.634 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:54:50.635 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-choose-args
2026-03-08T22:54:50.636 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:54:50.636 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:50.636 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:54:50.636 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50888
2026-03-08T22:54:50.637 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:54:50.637 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:54:50.637 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/crush-choose-args
2026-03-08T22:54:50.638 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:54:50.638 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:50.638 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:54:50.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50888
2026-03-08T22:54:50.639 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:54:50.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:54:50.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:54:50.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/crush-choose-args 1' TERM HUP INT
2026-03-08T22:54:50.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:40: run: TEST_move_bucket td/crush-choose-args
2026-03-08T22:54:50.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:195: TEST_move_bucket: local dir=td/crush-choose-args
2026-03-08T22:54:50.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:197: TEST_move_bucket: run_mon td/crush-choose-args a
2026-03-08T22:54:50.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/crush-choose-args
2026-03-08T22:54:50.640 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:54:50.640 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:54:50.640 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:54:50.640 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/crush-choose-args/a
2026-03-08T22:54:50.640 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-choose-args/a --run-dir=td/crush-choose-args
2026-03-08T22:54:50.670 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:54:50.670 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:54:50.670 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:54:50.670 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:54:50.670 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:50.670 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:54:50.670 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok'
2026-03-08T22:54:50.671 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-choose-args/a '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --mon-cluster-log-file=td/crush-choose-args/log --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:54:50.703 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:54:50.704 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:54:50.704 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:54:50.704 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:54:50.704 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:54:50.705 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:54:50.705 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:54:50.705 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:54:50.705 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:54:50.706 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:54:50.706 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:50.707 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:54:50.707 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50888/ceph-mon.a.asok
2026-03-08T22:54:50.707 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:54:50.707 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50888/ceph-mon.a.asok config get fsid
2026-03-08T22:54:50.761 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:54:50.761 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:54:50.761 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:54:50.761 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:54:50.761 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:54:50.762 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:54:50.762 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:54:50.762 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:54:50.762 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:54:50.762 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:50.762 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:54:50.762 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50888/ceph-mon.a.asok
2026-03-08T22:54:50.762 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:54:50.762 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50888/ceph-mon.a.asok config get mon_host
2026-03-08T22:54:50.816 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:198: TEST_move_bucket: run_mgr td/crush-choose-args x
2026-03-08T22:54:50.816 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/crush-choose-args
2026-03-08T22:54:50.816 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:54:50.816 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:54:50.816 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:54:50.816 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/crush-choose-args/x
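
get_config resolves the daemon's admin socket (get_asok_path expands /tmp/ceph-asok.50888/$cluster-$name.asok to ceph-mon.a.asok) and queries the live process directly; CEPH_ARGS is cleared first so the test environment's cluster flags don't leak into the client call:

    # Query a running daemon's effective config over its admin socket.
    CEPH_ARGS='' ceph --format json daemon \
        /tmp/ceph-asok.50888/ceph-mon.a.asok config get fsid | jq -r .fsid

The same pattern with config get mon_host hands run_mon the values it evidently feeds into the file written by the cat at ceph-helpers.sh:487 for the rest of the run.
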
2026-03-08T22:54:50.816 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:54:50.933 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:54:50.933 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:50.933 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:50.933 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:50.933 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:50.933 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:54:50.933 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:54:50.934 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:54:50.934 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/crush-choose-args/x '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:199: TEST_move_bucket: run_osd td/crush-choose-args 0 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-choose-args 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-choose-args/0 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false ' 2026-03-08T22:54:50.956 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/0' 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/0/journal' 2026-03-08T22:54:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:50.957 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:50.957 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-08T22:54:50.957 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:50.957 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:50.957 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: 
ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:54:50.958 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-choose-args/0 2026-03-08T22:54:50.959 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:50.960 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=a21967c6-246f-4f0a-b678-319d97e6bc59 2026-03-08T22:54:50.960 INFO:tasks.workunit.client.0.vm10.stdout:add osd0 a21967c6-246f-4f0a-b678-319d97e6bc59 2026-03-08T22:54:50.960 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 a21967c6-246f-4f0a-b678-319d97e6bc59' 2026-03-08T22:54:50.960 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:50.976 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQA6/q1pIPQjOhAAJnDgdlNiSLxi6JIt3D/saA== 2026-03-08T22:54:50.976 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQA6/q1pIPQjOhAAJnDgdlNiSLxi6JIt3D/saA=="}' 2026-03-08T22:54:50.976 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new a21967c6-246f-4f0a-b678-319d97e6bc59 -i td/crush-choose-args/0/new.json 2026-03-08T22:54:51.108 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:54:51.119 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-choose-args/0/new.json 2026-03-08T22:54:51.120 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 
--crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQA6/q1pIPQjOhAAJnDgdlNiSLxi6JIt3D/saA== --osd-uuid a21967c6-246f-4f0a-b678-319d97e6bc59 2026-03-08T22:54:51.141 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:51.141+0000 7f797dac2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:51.143 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:51.143+0000 7f797dac2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:51.145 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:51.145+0000 7f797dac2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:51.145 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:51.146+0000 7f797dac2780 -1 bdev(0x556b68d23c00 td/crush-choose-args/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:54:51.145 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:51.146+0000 7f797dac2780 -1 bluestore(td/crush-choose-args/0) _read_fsid unparsable uuid 2026-03-08T22:54:53.408 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-choose-args/0/keyring 2026-03-08T22:54:53.408 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:54:53.409 INFO:tasks.workunit.client.0.vm10.stdout:adding osd0 key to auth repository 2026-03-08T22:54:53.409 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:54:53.409 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-choose-args/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:54:53.738 INFO:tasks.workunit.client.0.vm10.stdout:start osd.0 2026-03-08T22:54:53.738 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:54:53.738 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' 
--osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:54:53.738 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:54:53.739 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:54:53.741 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:54:53.761 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:53.759+0000 7f29269ae780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:53.761 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:53.761+0000 7f29269ae780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:53.763 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:53.763+0000 7f29269ae780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:53.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:54:53.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:53.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:54:53.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:53.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:53.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:53.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:53.986 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:54:53.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:53.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:54.217 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:55.085 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:55.086+0000 7f29269ae780 -1 Falling back to public interface 2026-03-08T22:54:55.218 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:54:55.218 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:55.218 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:55.218 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 1 2026-03-08T22:54:55.219 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:55.219 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:55.454 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:56.214 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:56.214+0000 7f29269ae780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:54:56.455 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:56.455 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:56.455 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:56.455 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:54:56.456 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:56.456 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:56.701 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:57.703 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:54:57.703 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:57.703 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:57.703 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:54:57.703 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:57.703 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:57.959 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:58.962 INFO:tasks.workunit.client.0.vm10.stdout:4 2026-03-08T22:54:58.962 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:58.962 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:58.962 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:54:58.962 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:58.962 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:59.199 INFO:tasks.workunit.client.0.vm10.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1755899558,v1:127.0.0.1:6803/1755899558] [v2:127.0.0.1:6804/1755899558,v1:127.0.0.1:6805/1755899558] exists,up a21967c6-246f-4f0a-b678-319d97e6bc59 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:200: TEST_move_bucket: run_osd td/crush-choose-args 1 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-choose-args 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-choose-args/1 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false ' 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/1' 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/1/journal' 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:59.200 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:59.200 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-08T22:54:59.201 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:59.201 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:59.201 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:59.201 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:59.201 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:59.201 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:54:59.201 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:54:59.201 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:54:59.201 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:59.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:59.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:59.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-08T22:54:59.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-08T22:54:59.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:59.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:59.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:59.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:59.202 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:59.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:54:59.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-choose-args/1 2026-03-08T22:54:59.203 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:59.204 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=a1c47c62-1965-4985-aaec-76c8b0a93f66 2026-03-08T22:54:59.204 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 a1c47c62-1965-4985-aaec-76c8b0a93f66' 2026-03-08T22:54:59.204 INFO:tasks.workunit.client.0.vm10.stdout:add osd1 a1c47c62-1965-4985-aaec-76c8b0a93f66 2026-03-08T22:54:59.204 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:59.220 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBD/q1paIweDRAAKTCaZvmIGMw+5VyiAr4qWg== 2026-03-08T22:54:59.220 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBD/q1paIweDRAAKTCaZvmIGMw+5VyiAr4qWg=="}' 2026-03-08T22:54:59.220 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new a1c47c62-1965-4985-aaec-76c8b0a93f66 -i td/crush-choose-args/1/new.json 2026-03-08T22:54:59.483 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:54:59.496 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-choose-args/1/new.json 2026-03-08T22:54:59.497 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBD/q1paIweDRAAKTCaZvmIGMw+5VyiAr4qWg== --osd-uuid a1c47c62-1965-4985-aaec-76c8b0a93f66 2026-03-08T22:54:59.521 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:59.521+0000 7f35d3dd4780 -1 WARNING: all dangerous and experimental features are enabled. 
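osd.1 now goes through the same provisioning sequence that osd.0 completed above. Condensed from the trace (the long accumulated daemon flag list is elided as $ceph_args), run_osd does:

    # run_osd provisioning for one OSD, as traced; $dir/$id is td/crush-choose-args/1 here.
    dir=td/crush-choose-args; id=1
    mkdir -p "$dir/$id"
    uuid=$(uuidgen)
    OSD_SECRET=$(ceph-authtool --gen-print-key)
    echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > "$dir/$id/new.json"
    ceph osd new "$uuid" -i "$dir/$id/new.json"   # registers the OSD; prints the allocated id
    rm "$dir/$id/new.json"
    ceph-osd -i "$id" --mkfs --key "$OSD_SECRET" --osd-uuid "$uuid" $ceph_args

The "bdev ... open stat got: (1) Operation not permitted" and "_read_fsid unparsable uuid" lines around the mkfs are the same noise osd.0 produced above; the run continued past them there, and does so again here.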
2026-03-08T22:54:59.523 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:59.524+0000 7f35d3dd4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:59.525 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:59.525+0000 7f35d3dd4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:59.525 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:59.526+0000 7f35d3dd4780 -1 bdev(0x564486651c00 td/crush-choose-args/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:54:59.525 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:54:59.526+0000 7f35d3dd4780 -1 bluestore(td/crush-choose-args/1) _read_fsid unparsable uuid 2026-03-08T22:55:03.195 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-choose-args/1/keyring 2026-03-08T22:55:03.195 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:55:03.197 INFO:tasks.workunit.client.0.vm10.stdout:adding osd1 key to auth repository 2026-03-08T22:55:03.197 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:55:03.197 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-choose-args/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:55:03.596 INFO:tasks.workunit.client.0.vm10.stdout:start osd.1 2026-03-08T22:55:03.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:55:03.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:55:03.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:55:03.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:55:03.600 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:55:03.619 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:03.617+0000 7f0030410780 -1 WARNING: all dangerous and experimental features are enabled. 
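After launching the daemon, run_osd checks for the cluster-wide noup flag before polling, since a set noup flag would make the wait spin for the full 300 tries. The check and the wait_for_osd loop from the trace, as a sketch:

    # Only wait for 'up' if the noup flag isn't set on the cluster.
    if ! ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"'; then
        # wait_for_osd polls once per second, up to 300 attempts.
        for ((i = 0; i < 300; i++)); do
            ceph osd dump | grep 'osd.1 up' && break
            sleep 1
        done
    fi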
2026-03-08T22:55:03.620 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:03.620+0000 7f0030410780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:03.621 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:03.622+0000 7f0030410780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:04.018 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:55:04.018 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:55:04.018 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:55:04.018 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:55:04.018 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:55:04.018 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:04.018 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:55:04.019 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:55:04.019 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:04.019 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:04.260 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:04.451 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:04.451+0000 7f0030410780 -1 Falling back to public interface 2026-03-08T22:55:05.261 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:05.261 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:05.261 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:55:05.261 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:55:05.262 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:05.262 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:05.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:06.065 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:06.065+0000 7f0030410780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:55:06.493 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:55:06.493 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:06.493 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:06.493 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:55:06.494 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:06.494 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:06.776 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:07.779 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:55:07.779 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:07.779 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:07.779 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:55:07.779 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:07.779 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:08.017 INFO:tasks.workunit.client.0.vm10.stdout:osd.1 up in weight 1 up_from 8 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1828177320,v1:127.0.0.1:6811/1828177320] [v2:127.0.0.1:6812/1828177320,v1:127.0.0.1:6813/1828177320] exists,up a1c47c62-1965-4985-aaec-76c8b0a93f66 2026-03-08T22:55:08.017 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:55:08.017 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:55:08.017 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:55:08.017 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:202: TEST_move_bucket: ceph osd crush weight-set create-compat 2026-03-08T22:55:08.268 INFO:tasks.workunit.client.0.vm10.stderr:compat weight-set already created 2026-03-08T22:55:08.279 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:203: TEST_move_bucket: ceph osd crush weight-set reweight-compat osd.0 2 2026-03-08T22:55:08.617 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:204: TEST_move_bucket: ceph osd crush weight-set reweight-compat osd.1 2 2026-03-08T22:55:08.936 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:205: TEST_move_bucket: ceph osd crush tree 
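With both OSDs up, TEST_move_bucket can exercise the compat weight set: an alternative per-item weight (the "(compat)" column in the tree dumps that follow) maintained alongside the CRUSH weights. From the trace:

    # create-compat is idempotent; the mon answered
    # "compat weight-set already created" above.
    ceph osd crush weight-set create-compat
    ceph osd crush weight-set reweight-compat osd.0 2
    ceph osd crush weight-set reweight-compat osd.1 2
    ceph osd crush tree   # host HOST: WEIGHT 6.00000, (compat) 4.00000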
2026-03-08T22:55:09.162 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT (compat) TYPE NAME 2026-03-08T22:55:09.162 INFO:tasks.workunit.client.0.vm10.stdout:-1 6.00000 root default 2026-03-08T22:55:09.162 INFO:tasks.workunit.client.0.vm10.stdout:-2 6.00000 4.00000 host HOST 2026-03-08T22:55:09.162 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0 2026-03-08T22:55:09.162 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 2.00000 osd.1 2026-03-08T22:55:09.173 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:206: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:09.174 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:206: TEST_move_bucket: grep '6.00000 4.00000' 2026-03-08T22:55:09.175 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:206: TEST_move_bucket: grep HOST 2026-03-08T22:55:09.418 INFO:tasks.workunit.client.0.vm10.stdout:-2 6.00000 4.00000 host HOST 2026-03-08T22:55:09.418 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:209: TEST_move_bucket: ceph osd crush add-bucket RACK rack root=default 2026-03-08T22:55:09.661 INFO:tasks.workunit.client.0.vm10.stderr:bucket 'RACK' already exists 2026-03-08T22:55:09.672 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:210: TEST_move_bucket: ceph osd crush move HOST rack=RACK 2026-03-08T22:55:09.971 INFO:tasks.workunit.client.0.vm10.stderr:no need to move item id -2 name 'HOST' to location {rack=RACK} in crush map 2026-03-08T22:55:09.982 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:211: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:10.215 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT (compat) TYPE NAME 2026-03-08T22:55:10.215 INFO:tasks.workunit.client.0.vm10.stdout:-1 6.00000 root default 2026-03-08T22:55:10.215 INFO:tasks.workunit.client.0.vm10.stdout:-3 6.00000 4.00000 rack RACK 2026-03-08T22:55:10.215 INFO:tasks.workunit.client.0.vm10.stdout:-2 6.00000 4.00000 host HOST 2026-03-08T22:55:10.215 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0 2026-03-08T22:55:10.215 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 2.00000 osd.1 2026-03-08T22:55:10.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:212: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:10.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:212: TEST_move_bucket: grep HOST 2026-03-08T22:55:10.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:212: TEST_move_bucket: grep '6.00000 4.00000' 2026-03-08T22:55:10.463 INFO:tasks.workunit.client.0.vm10.stdout:-2 6.00000 4.00000 host HOST 2026-03-08T22:55:10.464 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:213: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:10.464 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:213: TEST_move_bucket: grep '6.00000 4.00000' 2026-03-08T22:55:10.465 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:213: TEST_move_bucket: grep RACK 2026-03-08T22:55:10.711 INFO:tasks.workunit.client.0.vm10.stdout:-3 6.00000 4.00000 rack RACK 2026-03-08T22:55:10.711 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:216: TEST_move_bucket: ceph osd crush weight-set reweight-compat osd.0 1 2026-03-08T22:55:11.002 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:217: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:11.224 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT (compat) TYPE NAME 2026-03-08T22:55:11.224 INFO:tasks.workunit.client.0.vm10.stdout:-1 6.00000 root default 2026-03-08T22:55:11.225 INFO:tasks.workunit.client.0.vm10.stdout:-3 6.00000 3.00000 rack RACK 2026-03-08T22:55:11.225 INFO:tasks.workunit.client.0.vm10.stdout:-2 6.00000 3.00000 host HOST 2026-03-08T22:55:11.225 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 1.00000 osd.0 2026-03-08T22:55:11.225 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 2.00000 osd.1 2026-03-08T22:55:11.236 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:218: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:11.236 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:218: TEST_move_bucket: grep '6.00000 3.00000' 2026-03-08T22:55:11.237 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:218: TEST_move_bucket: grep HOST 2026-03-08T22:55:11.498 INFO:tasks.workunit.client.0.vm10.stdout:-2 6.00000 3.00000 host HOST 2026-03-08T22:55:11.499 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:219: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:11.500 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:219: TEST_move_bucket: grep RACK 2026-03-08T22:55:11.502 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:219: TEST_move_bucket: grep '6.00000 3.00000' 2026-03-08T22:55:11.741 INFO:tasks.workunit.client.0.vm10.stdout:-3 6.00000 3.00000 rack RACK 2026-03-08T22:55:11.742 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:222: TEST_move_bucket: ceph config set mon osd_crush_update_weight_set true 2026-03-08T22:55:11.987 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:223: TEST_move_bucket: ceph osd crush add-bucket FOO host root=default 2026-03-08T22:55:12.279 INFO:tasks.workunit.client.0.vm10.stderr:bucket 'FOO' already exists 2026-03-08T22:55:12.290 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:224: TEST_move_bucket: ceph osd crush move osd.0 host=FOO 2026-03-08T22:55:12.567 INFO:tasks.workunit.client.0.vm10.stderr:no need to move item id 0 name 'osd.0' to location {host=FOO} in crush map 2026-03-08T22:55:12.576 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:225: TEST_move_bucket: ceph osd crush tree 
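Next the test flips osd_crush_update_weight_set on and moves osd.0 into the FOO bucket; with the option true, the move also writes the item's CRUSH weight (3.00000) into the weight set at the destination, which is what the tree dump below and the '3.00000 3.00000' grep verify. Condensed:

    # With osd_crush_update_weight_set=true, a crush move refreshes the
    # weight-set entry at the destination to the item's CRUSH weight.
    ceph config set mon osd_crush_update_weight_set true
    ceph osd crush add-bucket FOO host root=default   # "bucket 'FOO' already exists" is fine
    ceph osd crush move osd.0 host=FOO
    ceph osd crush tree | grep '3.00000 3.00000' | grep osd.0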
2026-03-08T22:55:12.804 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT (compat) TYPE NAME 2026-03-08T22:55:12.804 INFO:tasks.workunit.client.0.vm10.stdout:-1 6.00000 root default 2026-03-08T22:55:12.805 INFO:tasks.workunit.client.0.vm10.stdout:-4 3.00000 3.00000 host FOO 2026-03-08T22:55:12.805 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 3.00000 osd.0 2026-03-08T22:55:12.805 INFO:tasks.workunit.client.0.vm10.stdout:-3 3.00000 2.00000 rack RACK 2026-03-08T22:55:12.805 INFO:tasks.workunit.client.0.vm10.stdout:-2 3.00000 2.00000 host HOST 2026-03-08T22:55:12.805 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 2.00000 osd.1 2026-03-08T22:55:12.817 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:226: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:12.817 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:226: TEST_move_bucket: grep '3.00000 3.00000' 2026-03-08T22:55:12.818 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:226: TEST_move_bucket: grep osd.0 2026-03-08T22:55:13.056 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 3.00000 osd.0 2026-03-08T22:55:13.056 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:227: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:13.056 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:227: TEST_move_bucket: grep '3.00000 2.00000' 2026-03-08T22:55:13.058 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:227: TEST_move_bucket: grep HOST 2026-03-08T22:55:13.292 INFO:tasks.workunit.client.0.vm10.stdout:-2 3.00000 2.00000 host HOST 2026-03-08T22:55:13.292 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:228: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:13.293 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:228: TEST_move_bucket: grep '3.00000 2.00000' 2026-03-08T22:55:13.294 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:228: TEST_move_bucket: grep RACK 2026-03-08T22:55:13.546 INFO:tasks.workunit.client.0.vm10.stdout:-3 3.00000 2.00000 rack RACK 2026-03-08T22:55:13.546 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:231: TEST_move_bucket: ceph config set mon osd_crush_update_weight_set false 2026-03-08T22:55:13.778 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:232: TEST_move_bucket: ceph osd crush move osd.1 host=FOO 2026-03-08T22:55:14.065 INFO:tasks.workunit.client.0.vm10.stderr:no need to move item id 1 name 'osd.1' to location {host=FOO} in crush map 2026-03-08T22:55:14.074 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:233: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:14.294 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT (compat) TYPE NAME 2026-03-08T22:55:14.294 INFO:tasks.workunit.client.0.vm10.stdout:-1 6.00000 root default 2026-03-08T22:55:14.294 
INFO:tasks.workunit.client.0.vm10.stdout:-4 6.00000 3.00000 host FOO 2026-03-08T22:55:14.294 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 3.00000 osd.0 2026-03-08T22:55:14.294 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 0 osd.1 2026-03-08T22:55:14.294 INFO:tasks.workunit.client.0.vm10.stdout:-3 0 0 rack RACK 2026-03-08T22:55:14.294 INFO:tasks.workunit.client.0.vm10.stdout:-2 0 0 host HOST 2026-03-08T22:55:14.305 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:234: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:14.305 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:234: TEST_move_bucket: grep '3.00000 3.00000' 2026-03-08T22:55:14.306 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:234: TEST_move_bucket: grep osd.0 2026-03-08T22:55:14.545 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 3.00000 osd.0 2026-03-08T22:55:14.546 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:235: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:14.546 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:235: TEST_move_bucket: grep '3.00000 0' 2026-03-08T22:55:14.547 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:235: TEST_move_bucket: grep osd.1 2026-03-08T22:55:14.777 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 0 osd.1 2026-03-08T22:55:14.778 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:236: TEST_move_bucket: ceph osd crush tree 2026-03-08T22:55:14.778 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:236: TEST_move_bucket: grep '6.00000 3.00000' 2026-03-08T22:55:14.779 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:236: TEST_move_bucket: grep FOO 2026-03-08T22:55:15.014 INFO:tasks.workunit.client.0.vm10.stdout:-4 6.00000 3.00000 host FOO 2026-03-08T22:55:15.014 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:41: run: teardown td/crush-choose-args 2026-03-08T22:55:15.014 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-choose-args 2026-03-08T22:55:15.014 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:55:15.014 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-choose-args KILL 2026-03-08T22:55:15.015 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:15.015 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:15.015 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 
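The counterpart just ran with the option off: osd.1 moved into FOO keeps its CRUSH weight of 3.00000 but gets no weight-set entry at the destination, hence the '3.00000 0' row the greps above matched, after which run() begins tearing the cluster down. Condensed:

    # With osd_crush_update_weight_set=false the moved item's compat
    # weight at the new location stays 0.
    ceph config set mon osd_crush_update_weight_set false
    ceph osd crush move osd.1 host=FOO
    ceph osd crush tree | grep '3.00000 0' | grep osd.1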
2026-03-08T22:55:15.015 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:55:15.015 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:55:15.127 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:55:15.127 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:55:15.128 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:55:15.128 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:55:15.129 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:55:15.129 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:55:15.130 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:55:15.131 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:55:15.131 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:55:15.131 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:55:15.131 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:55:15.132 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:55:15.133 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:55:15.133 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-choose-args
2026-03-08T22:55:15.143 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:55:15.143 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:55:15.144 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:55:15.144 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50888
2026-03-08T22:55:15.144 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:38: run: for func in $funcs
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:39: run: setup td/crush-choose-args
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/crush-choose-args
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/crush-choose-args
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-choose-args
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-choose-args KILL
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:55:15.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:55:15.148 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:55:15.148 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:55:15.149 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:55:15.149 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:55:15.150 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:55:15.150 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:55:15.150 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:55:15.151 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:55:15.151 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:55:15.151 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:55:15.151 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:55:15.152 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:55:15.153 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:55:15.153 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-choose-args
2026-03-08T22:55:15.154 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:55:15.154 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:55:15.154 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:55:15.154 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50888
2026-03-08T22:55:15.155 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:55:15.155 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:55:15.155 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/crush-choose-args
2026-03-08T22:55:15.156 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:55:15.156 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:55:15.156 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:55:15.156 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50888
2026-03-08T22:55:15.157 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/crush-choose-args 1' TERM HUP INT
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:40: run: TEST_no_update_weight_set td/crush-choose-args
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:104: TEST_no_update_weight_set: local dir=td/crush-choose-args
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:106: TEST_no_update_weight_set: ORIG_CEPH_ARGS='--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false '
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:107: TEST_no_update_weight_set: CEPH_ARGS+='--osd-crush-update-weight-set=false '
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:109: TEST_no_update_weight_set: run_mon td/crush-choose-args a
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/crush-choose-args
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/crush-choose-args/a
2026-03-08T22:55:15.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-choose-args/a --run-dir=td/crush-choose-args
2026-03-08T22:55:15.239 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:55:15.239 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:55:15.239 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:55:15.239 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:55:15.239 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:55:15.239 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:55:15.239 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok'
2026-03-08T22:55:15.240 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-choose-args/a '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --mon-cluster-log-file=td/crush-choose-args/log --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:55:15.290 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:55:15.290 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:55:15.290 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:55:15.290 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:55:15.290 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:55:15.290 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:55:15.290 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:55:15.290 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:55:15.290 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:55:15.291 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:55:15.291 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:55:15.291 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:55:15.292 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50888/ceph-mon.a.asok
2026-03-08T22:55:15.292 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:55:15.292 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50888/ceph-mon.a.asok config get fsid
2026-03-08T22:55:15.347 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:55:15.347 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:55:15.347 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:55:15.347 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:55:15.347 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:55:15.350 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:55:15.350 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:55:15.350 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:55:15.350 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:55:15.350 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:55:15.350 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:55:15.350 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50888/ceph-mon.a.asok
2026-03-08T22:55:15.350 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:55:15.350 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50888/ceph-mon.a.asok config get mon_host
2026-03-08T22:55:15.408 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:110: TEST_no_update_weight_set: run_mgr td/crush-choose-args x
2026-03-08T22:55:15.409 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/crush-choose-args
2026-03-08T22:55:15.409 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:55:15.409 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:55:15.409 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:55:15.409 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/crush-choose-args/x
2026-03-08T22:55:15.409 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:55:15.544 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:55:15.544 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:55:15.544 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:55:15.545 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:55:15.545 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:55:15.545 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:55:15.545 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok'
2026-03-08T22:55:15.546 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:55:15.547 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/crush-choose-args/x '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:55:15.573 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:111: TEST_no_update_weight_set: run_osd td/crush-choose-args 0
2026-03-08T22:55:15.573 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-choose-args
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-choose-args/0
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false '
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/0'
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/0/journal'
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-choose-args'
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:55:15.575 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:55:15.576 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:55:15.576 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:55:15.576 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:55:15.576 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok'
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok'
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log'
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid'
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:55:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-choose-args/0
2026-03-08T22:55:15.579 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:55:15.580 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1adf403d-20dd-4e9c-9372-7c4889c39003
2026-03-08T22:55:15.580 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 1adf403d-20dd-4e9c-9372-7c4889c39003'
2026-03-08T22:55:15.580 INFO:tasks.workunit.client.0.vm10.stdout:add osd0 1adf403d-20dd-4e9c-9372-7c4889c39003
2026-03-08T22:55:15.580 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:55:15.598 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBT/q1pG8CAIxAAzueQwn/Gp6H6ovVBTZrAeA==
2026-03-08T22:55:15.598 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBT/q1pG8CAIxAAzueQwn/Gp6H6ovVBTZrAeA=="}'
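Condensed, the run_osd preparation traced above amounts to the following (a sketch paraphrased from the xtrace output of qa/standalone/ceph-helpers.sh, abbreviated rather than verbatim; $dir and $id as in the trace):

    # run_osd $dir $id: extend the test's CEPH_ARGS with per-OSD flags,
    # then mint an identity and a cephx key for the new OSD
    ceph_args="$CEPH_ARGS --osd-failsafe-full-ratio=.99 --osd-data=$dir/$id --osd-journal=$dir/$id/journal --run-dir=$dir"  # plus the debug/log/pid/admin-socket flags seen in the trace
    mkdir -p $dir/$id
    uuid=$(uuidgen)                              # identity of the new OSD
    OSD_SECRET=$(ceph-authtool --gen-print-key)  # cephx key for the new OSD
    echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > $dir/$id/new.json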
2026-03-08T22:55:15.598 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1adf403d-20dd-4e9c-9372-7c4889c39003 -i td/crush-choose-args/0/new.json
2026-03-08T22:55:15.731 INFO:tasks.workunit.client.0.vm10.stdout:0
2026-03-08T22:55:15.740 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-choose-args/0/new.json
2026-03-08T22:55:15.740 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBT/q1pG8CAIxAAzueQwn/Gp6H6ovVBTZrAeA== --osd-uuid 1adf403d-20dd-4e9c-9372-7c4889c39003
2026-03-08T22:55:15.763 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:15.762+0000 7f0ce032e780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:15.767 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:15.768+0000 7f0ce032e780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:15.773 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:15.773+0000 7f0ce032e780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:15.773 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:15.773+0000 7f0ce032e780 -1 bdev(0x5628ecb1dc00 td/crush-choose-args/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:55:15.773 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:15.773+0000 7f0ce032e780 -1 bluestore(td/crush-choose-args/0) _read_fsid unparsable uuid
2026-03-08T22:55:18.048 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-choose-args/0/keyring
2026-03-08T22:55:18.048 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:55:18.048 INFO:tasks.workunit.client.0.vm10.stdout:adding osd0 key to auth repository
2026-03-08T22:55:18.048 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:55:18.049 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-choose-args/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:55:18.242 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:55:18.242 INFO:tasks.workunit.client.0.vm10.stdout:start osd.0
2026-03-08T22:55:18.242 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:55:18.242 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:55:18.243 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:55:18.245 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:55:18.263 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:18.263+0000 7f143fc22780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:18.271+0000 7f143fc22780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:18.272+0000 7f143fc22780 -1 WARNING: all dangerous and experimental features are enabled.
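Stripped of the xtrace noise, the osd.0 bring-up above is the usual four-step provisioning sequence (sketch paraphrased from the trace; $ceph_args as assembled earlier, the long flag lists elided):

    ceph osd new $uuid -i $dir/$id/new.json   # register with the mon; prints the allocated id
    ceph-osd -i $id $ceph_args --mkfs --key $OSD_SECRET --osd-uuid $uuid   # format the data dir
    ceph -i $dir/$id/keyring auth add osd.$id osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
    ceph-osd -i $id $ceph_args                # start the daemon

The 'bdev ... open stat got: (1) Operation not permitted' and '_read_fsid unparsable uuid' lines during --mkfs appear to be probing noise against the not-yet-created block file; the mkfs run continues and the daemon starts below.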
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stdout:0
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:18.362 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:55:18.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:19.586 INFO:tasks.workunit.client.0.vm10.stdout:1
2026-03-08T22:55:19.587 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:19.587 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:19.587 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:55:19.587 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:19.587 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:55:19.819 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:19.861 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:19.862+0000 7f143fc22780 -1 Falling back to public interface
2026-03-08T22:55:20.726 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:20.726+0000 7f143fc22780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:55:20.820 INFO:tasks.workunit.client.0.vm10.stdout:2
2026-03-08T22:55:20.820 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:20.820 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:20.820 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:55:20.822 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:20.822 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:55:21.079 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:22.081 INFO:tasks.workunit.client.0.vm10.stdout:3
2026-03-08T22:55:22.081 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:22.081 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:22.081 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:55:22.081 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:22.081 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:55:22.320 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:23.322 INFO:tasks.workunit.client.0.vm10.stdout:4
2026-03-08T22:55:23.322 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:23.322 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:23.322 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4
2026-03-08T22:55:23.322 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:23.322 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:55:23.541 INFO:tasks.workunit.client.0.vm10.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3812121045,v1:127.0.0.1:6803/3812121045] [v2:127.0.0.1:6804/3812121045,v1:127.0.0.1:6805/3812121045] exists,up 1adf403d-20dd-4e9c-9372-7c4889c39003
2026-03-08T22:55:23.541 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:55:23.541 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:55:23.541 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:55:23.541 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:113: TEST_no_update_weight_set: ceph osd set-require-min-compat-client luminous
2026-03-08T22:55:23.831 INFO:tasks.workunit.client.0.vm10.stderr:set require_min_compat_client to luminous
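The wait_for_osd trace above is a plain polling loop: up to 300 one-second probes of the osdmap until the daemon reports the requested state (sketch paraphrased from the ceph-helpers.sh lines shown in the xtrace):

    status=1
    for ((i=0; i < 300; i++)); do
        echo $i
        # succeed once the osdmap lists the daemon in the wanted state
        if ceph osd dump | grep "osd.$id $state"; then
            status=0
            break
        fi
        sleep 1
    done

Here osd.0 matched on the fifth probe (i=4), once the map showed 'osd.0 up in weight 1 ...'.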
2026-03-08T22:55:23.843 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:114: TEST_no_update_weight_set: ceph osd crush tree
2026-03-08T22:55:24.061 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT TYPE NAME
2026-03-08T22:55:24.061 INFO:tasks.workunit.client.0.vm10.stdout:-1 3.00000 root default
2026-03-08T22:55:24.061 INFO:tasks.workunit.client.0.vm10.stdout:-2 3.00000 host HOST
2026-03-08T22:55:24.061 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 osd.0
2026-03-08T22:55:24.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:115: TEST_no_update_weight_set: ceph osd getcrushmap
2026-03-08T22:55:24.293 INFO:tasks.workunit.client.0.vm10.stderr:2
2026-03-08T22:55:24.305 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:116: TEST_no_update_weight_set: crushtool -d td/crush-choose-args/map -o td/crush-choose-args/map.txt
2026-03-08T22:55:24.323 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:117: TEST_no_update_weight_set: sed -i -e '/end crush map/d' td/crush-choose-args/map.txt
2026-03-08T22:55:24.324 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:118: TEST_no_update_weight_set: cat
2026-03-08T22:55:24.325 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:141: TEST_no_update_weight_set: crushtool -c td/crush-choose-args/map.txt -o td/crush-choose-args/map-new
2026-03-08T22:55:24.344 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:142: TEST_no_update_weight_set: ceph osd setcrushmap -i td/crush-choose-args/map-new
2026-03-08T22:55:24.644 INFO:tasks.workunit.client.0.vm10.stderr:4
2026-03-08T22:55:24.662 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:143: TEST_no_update_weight_set: ceph osd crush tree
2026-03-08T22:55:24.880 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT 0 TYPE NAME
2026-03-08T22:55:24.881 INFO:tasks.workunit.client.0.vm10.stdout:-1 3.00000 root default
2026-03-08T22:55:24.881 INFO:tasks.workunit.client.0.vm10.stdout:-2 3.00000 2.00000 host HOST
2026-03-08T22:55:24.881 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:146: TEST_no_update_weight_set: run_osd td/crush-choose-args 1
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-choose-args
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-choose-args/1
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false '
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/1'
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/1/journal'
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-choose-args'
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:55:24.892 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok'
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok'
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log'
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid'
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:55:24.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-choose-args/1
2026-03-08T22:55:24.895 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:55:24.895 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=77bc719c-798c-4c87-9812-82bc079694a2
2026-03-08T22:55:24.895 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 77bc719c-798c-4c87-9812-82bc079694a2'
2026-03-08T22:55:24.895 INFO:tasks.workunit.client.0.vm10.stdout:add osd1 77bc719c-798c-4c87-9812-82bc079694a2
2026-03-08T22:55:24.896 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:55:24.910 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBc/q1pGQ1HNhAAbgP80GeHePUWkUMpa9eUUg==
2026-03-08T22:55:24.911 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBc/q1pGQ1HNhAAbgP80GeHePUWkUMpa9eUUg=="}'
2026-03-08T22:55:24.911 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 77bc719c-798c-4c87-9812-82bc079694a2 -i td/crush-choose-args/1/new.json
2026-03-08T22:55:25.150 INFO:tasks.workunit.client.0.vm10.stdout:1
2026-03-08T22:55:25.160 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-choose-args/1/new.json
2026-03-08T22:55:25.161 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBc/q1pGQ1HNhAAbgP80GeHePUWkUMpa9eUUg== --osd-uuid 77bc719c-798c-4c87-9812-82bc079694a2
2026-03-08T22:55:25.183 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:25.184+0000 7fdd94c1a780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:25.185 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:25.186+0000 7fdd94c1a780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:25.186 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:25.187+0000 7fdd94c1a780 -1 WARNING: all dangerous and experimental features are enabled.
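For reference, the crushmap edit traced at crush-choose-args.sh:114-142 above follows the standard decompile/edit/recompile round trip (a sketch; the heredoc body is the test-specific choose_args text fed to the 'cat' step in the trace and is deliberately elided here):

    ceph osd getcrushmap > $dir/map             # binary map; the number on stderr is its version
    crushtool -d $dir/map -o $dir/map.txt       # decompile to editable text
    sed -i -e '/end crush map/d' $dir/map.txt   # drop the end marker so text can be appended
    cat >> $dir/map.txt <<EOF                   # append the test's choose_args section
    ...
    EOF
    crushtool -c $dir/map.txt -o $dir/map-new   # recompile
    ceph osd setcrushmap -i $dir/map-new        # inject; stderr reports the new version

The 'ceph osd crush tree' output gaining a second weight column (headed by the choose_args id) after setcrushmap is the injected weight set taking effect.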
2026-03-08T22:55:25.187 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:25.187+0000 7fdd94c1a780 -1 bdev(0x55708e5c5c00 td/crush-choose-args/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:55:25.187 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:25.188+0000 7fdd94c1a780 -1 bluestore(td/crush-choose-args/1) _read_fsid unparsable uuid
2026-03-08T22:55:27.865 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-choose-args/1/keyring
2026-03-08T22:55:27.865 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:55:27.866 INFO:tasks.workunit.client.0.vm10.stdout:adding osd1 key to auth repository
2026-03-08T22:55:27.866 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T22:55:27.866 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-choose-args/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:55:28.169 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T22:55:28.170 INFO:tasks.workunit.client.0.vm10.stdout:start osd.1
2026-03-08T22:55:28.170 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:55:28.170 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:55:28.171 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:55:28.173 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:55:28.194 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:28.194+0000 7f09a4ed9780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:28.201 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:28.202+0000 7f09a4ed9780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:28.203 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:28.203+0000 7f09a4ed9780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:28.406 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T22:55:28.407 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:55:28.407 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:55:28.407 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:55:28.407 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:55:28.407 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:28.407 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:55:28.407 INFO:tasks.workunit.client.0.vm10.stdout:0
2026-03-08T22:55:28.407 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:28.407 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:28.644 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:29.530 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:29.530+0000 7f09a4ed9780 -1 Falling back to public interface
2026-03-08T22:55:29.645 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:29.645 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:29.645 INFO:tasks.workunit.client.0.vm10.stdout:1
2026-03-08T22:55:29.645 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:55:29.647 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:29.647 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:29.896 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:30.397 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:30.398+0000 7f09a4ed9780 -1 osd.1 0 log_to_monitors true
2026-03-08T22:55:30.899 INFO:tasks.workunit.client.0.vm10.stdout:2
2026-03-08T22:55:30.899 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:30.899 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:30.899 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:55:30.899 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:30.899 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:31.145 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:32.146 INFO:tasks.workunit.client.0.vm10.stdout:3
2026-03-08T22:55:32.146 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:32.146 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:32.146 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:55:32.147 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:32.147 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:32.370 INFO:tasks.workunit.client.0.vm10.stdout:osd.1 up in weight 1 up_from 12 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3301817495,v1:127.0.0.1:6811/3301817495] [v2:127.0.0.1:6812/3301817495,v1:127.0.0.1:6813/3301817495] exists,up 77bc719c-798c-4c87-9812-82bc079694a2
2026-03-08T22:55:32.371 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:55:32.371 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:55:32.371 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:55:32.371 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:147: TEST_no_update_weight_set: ceph osd crush tree
2026-03-08T22:55:32.591 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT 0 TYPE NAME
2026-03-08T22:55:32.591 INFO:tasks.workunit.client.0.vm10.stdout:-1 6.00000 root default
2026-03-08T22:55:32.591 INFO:tasks.workunit.client.0.vm10.stdout:-2 6.00000 2.00000 host HOST
2026-03-08T22:55:32.591 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0
2026-03-08T22:55:32.591 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 0 osd.1
2026-03-08T22:55:32.601 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:148: TEST_no_update_weight_set: ceph osd getcrushmap
2026-03-08T22:55:32.815 INFO:tasks.workunit.client.0.vm10.stderr:5
2026-03-08T22:55:32.824 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:149: TEST_no_update_weight_set: crushtool -d td/crush-choose-args/map-one-more -o td/crush-choose-args/map-one-more.txt
2026-03-08T22:55:32.840 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:150: TEST_no_update_weight_set: cat td/crush-choose-args/map-one-more.txt
2026-03-08T22:55:32.840 INFO:tasks.workunit.client.0.vm10.stdout:# begin crush map
2026-03-08T22:55:32.840 INFO:tasks.workunit.client.0.vm10.stdout:tunable choose_local_tries 0
2026-03-08T22:55:32.840 INFO:tasks.workunit.client.0.vm10.stdout:tunable choose_local_fallback_tries 0
2026-03-08T22:55:32.840 INFO:tasks.workunit.client.0.vm10.stdout:tunable choose_total_tries 50
2026-03-08T22:55:32.840 INFO:tasks.workunit.client.0.vm10.stdout:tunable chooseleaf_descend_once 1
2026-03-08T22:55:32.840 INFO:tasks.workunit.client.0.vm10.stdout:tunable chooseleaf_vary_r 1
2026-03-08T22:55:32.840 INFO:tasks.workunit.client.0.vm10.stdout:tunable chooseleaf_stable 1
2026-03-08T22:55:32.840 INFO:tasks.workunit.client.0.vm10.stdout:tunable straw_calc_version 1
2026-03-08T22:55:32.840 INFO:tasks.workunit.client.0.vm10.stdout:tunable allowed_bucket_algs 54
2026-03-08T22:55:32.840 INFO:tasks.workunit.client.0.vm10.stdout:
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:# devices
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:device 0 osd.0
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:device 1 osd.1
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:# types
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 0 osd
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 1 host
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 2 chassis
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 3 rack
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 4 row
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 5 pdu
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 6 pod
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 7 room
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 8 datacenter
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 9 zone
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 10 region
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:type 11 root
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:# buckets
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:host HOST {
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: id -2 # do not change unnecessarily
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: # weight 6.00000
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: alg straw2
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: hash 0 # rjenkins1
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: item osd.0 weight 3.00000
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: item osd.1 weight 3.00000
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:}
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:root default {
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: id -1 # do not change unnecessarily
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: # weight 6.00000
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: alg straw2
2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: hash 0 # rjenkins1
2026-03-08T22:55:32.841
INFO:tasks.workunit.client.0.vm10.stdout: item HOST weight 6.00000 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:} 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:# rules 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:rule replicated_rule { 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: id 0 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: type replicated 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: step take default 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: step choose firstn 0 type osd 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: step emit 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:} 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:# choose_args 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout:choose_args 0 { 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: { 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: bucket_id -1 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: weight_set [ 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: [ 2.00000 ] 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: [ 1.00000 ] 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: ] 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: ids [ -10 ] 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: } 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: { 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: bucket_id -2 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: weight_set [ 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: [ 2.00000 0.00000 ] 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: [ 1.00000 0.00000 ] 2026-03-08T22:55:32.841 INFO:tasks.workunit.client.0.vm10.stdout: ] 2026-03-08T22:55:32.842 INFO:tasks.workunit.client.0.vm10.stdout: ids [ -20 1 ] 2026-03-08T22:55:32.842 INFO:tasks.workunit.client.0.vm10.stdout: } 2026-03-08T22:55:32.842 INFO:tasks.workunit.client.0.vm10.stdout:} 2026-03-08T22:55:32.842 INFO:tasks.workunit.client.0.vm10.stdout: 2026-03-08T22:55:32.842 INFO:tasks.workunit.client.0.vm10.stdout:# end crush map 2026-03-08T22:55:32.842 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:151: TEST_no_update_weight_set: diff -u td/crush-choose-args/map-one-more.txt /home/ubuntu/cephtest/clone.client.0/src/test/crush/crush-choose-args-expected-one-more-0.txt 2026-03-08T22:55:32.842 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:153: TEST_no_update_weight_set: destroy_osd td/crush-choose-args 1 2026-03-08T22:55:32.842 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:784: destroy_osd: local dir=td/crush-choose-args 2026-03-08T22:55:32.842 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:785: destroy_osd: local id=1 2026-03-08T22:55:32.842 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:787: destroy_osd: ceph osd out osd.1 2026-03-08T22:55:33.135 
INFO:tasks.workunit.client.0.vm10.stderr:osd.1 is already out. 2026-03-08T22:55:33.149 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:788: destroy_osd: kill_daemons td/crush-choose-args TERM osd.1 2026-03-08T22:55:33.150 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:33.150 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:33.150 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:33.150 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:33.150 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:33.457 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:33.457 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:789: destroy_osd: ceph osd down osd.1 2026-03-08T22:55:33.700 INFO:tasks.workunit.client.0.vm10.stderr:osd.1 is already down. 2026-03-08T22:55:33.714 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:790: destroy_osd: ceph osd purge osd.1 --yes-i-really-mean-it 2026-03-08T22:55:33.975 INFO:tasks.workunit.client.0.vm10.stderr:osd.1 does not exist 2026-03-08T22:55:33.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:791: destroy_osd: teardown td/crush-choose-args/1 2026-03-08T22:55:33.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-choose-args/1 2026-03-08T22:55:33.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:55:33.986 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-choose-args/1 KILL 2026-03-08T22:55:33.986 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:33.986 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:33.987 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:33.987 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:33.987 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:33.989 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:33.989 
INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:55:33.990 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:55:33.990 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:55:33.991 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:55:33.991 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:55:33.991 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:55:33.992 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:33.992 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:55:33.993 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:55:33.993 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:33.993 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:55:33.994 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:55:33.994 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-choose-args/1 2026-03-08T22:55:33.997 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:55:33.997 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:33.997 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:33.997 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50888 2026-03-08T22:55:34.001 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:55:34.001 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:55:34.001 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:792: destroy_osd: rm -fr td/crush-choose-args/1 2026-03-08T22:55:34.002 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:154: TEST_no_update_weight_set: ceph osd crush tree 2026-03-08T22:55:34.239 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT 0 TYPE NAME 2026-03-08T22:55:34.239 INFO:tasks.workunit.client.0.vm10.stdout:-1 3.00000 root default 2026-03-08T22:55:34.239 INFO:tasks.workunit.client.0.vm10.stdout:-2 3.00000 2.00000 host HOST 2026-03-08T22:55:34.239 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0 2026-03-08T22:55:34.252 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:155: TEST_no_update_weight_set: ceph osd getcrushmap 2026-03-08T22:55:34.472 INFO:tasks.workunit.client.0.vm10.stderr:6 2026-03-08T22:55:34.483 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:156: TEST_no_update_weight_set: crushtool -d td/crush-choose-args/map-one-less -o td/crush-choose-args/map-one-less.txt 2026-03-08T22:55:34.497 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:157: TEST_no_update_weight_set: diff -u td/crush-choose-args/map-one-less.txt td/crush-choose-args/map.txt 2026-03-08T22:55:34.498 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:159: TEST_no_update_weight_set: CEPH_ARGS='--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false ' 2026-03-08T22:55:34.499 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:41: run: teardown td/crush-choose-args 2026-03-08T22:55:34.499 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-choose-args 2026-03-08T22:55:34.499 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:55:34.499 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-choose-args KILL 2026-03-08T22:55:34.499 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:34.499 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:34.499 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:34.499 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:34.499 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:34.612 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:34.612 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 
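[annotation] Lines 154-157 of crush-choose-args.sh above close TEST_no_update_weight_set: after purging osd.1 the test decompiles the live CRUSH map and diffs it against the copy saved before the OSD was added, proving the choose_args weight-sets survived unchanged. The same round-trip can be run by hand (file names here are illustrative):

    ceph osd getcrushmap -o map.bin    # the map epoch is printed on stderr ("6" above)
    crushtool -d map.bin -o map.txt    # decompile the binary map to text
    diff -u map.txt expected.txt       # any non-empty diff fails the test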
2026-03-08T22:55:34.613 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:55:34.613 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:55:34.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:55:34.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:55:34.615 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:55:34.616 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:34.616 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:55:34.616 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:55:34.616 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:34.616 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:55:34.617 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:55:34.617 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-choose-args 2026-03-08T22:55:34.622 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:55:34.622 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:34.622 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:34.622 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50888 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:38: run: for func in $funcs 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:39: run: setup td/crush-choose-args 2026-03-08T22:55:34.626 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/crush-choose-args 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/crush-choose-args 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-choose-args 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-choose-args KILL 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:34.626 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:34.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:34.628 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:55:34.629 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:55:34.629 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
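[annotation] The trace is now between test functions: run() in crush-choose-args.sh (lines 38-41) tears down td/crush-choose-args and calls setup before invoking TEST_reweight. Condensed from the traced helpers, the per-test lifecycle looks like this (a sketch, not the verbatim helper):

    for func in $funcs; do
        setup $dir || return 1      # teardown $dir; mkdir -p $dir and the
                                    # asok dir; trap "teardown $dir 1" on
                                    # TERM/HUP/INT so daemons die on interrupt
        $func $dir || return 1
        teardown $dir               # kill_daemons ... KILL; rm -fr $dir
    done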
2026-03-08T22:55:34.630 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:55:34.630 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:55:34.630 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:55:34.631 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:34.631 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:55:34.631 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:55:34.631 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:34.632 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:55:34.633 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:55:34.633 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-choose-args 2026-03-08T22:55:34.634 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:55:34.634 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:34.634 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:34.634 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50888 2026-03-08T22:55:34.635 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:55:34.635 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:55:34.635 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/crush-choose-args 2026-03-08T22:55:34.636 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:55:34.636 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:34.636 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:34.636 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.50888 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/crush-choose-args 1' TERM HUP INT 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:40: run: TEST_reweight td/crush-choose-args 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:164: TEST_reweight: local dir=td/crush-choose-args 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:166: TEST_reweight: ORIG_CEPH_ARGS='--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false ' 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:167: TEST_reweight: CEPH_ARGS+='--osd-crush-update-weight-set=false ' 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:169: TEST_reweight: run_mon td/crush-choose-args a 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/crush-choose-args 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/crush-choose-args/a 2026-03-08T22:55:34.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-choose-args/a --run-dir=td/crush-choose-args 2026-03-08T22:55:34.686 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:55:34.686 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:34.686 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 
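[annotation] run_mon (ceph-helpers.sh:448-487) brings the monitor up in two passes: a ceph-mon --mkfs run to create the store, then the real daemon start carrying the test's debug levels and full-ratio overrides. Its skeleton, with the long flag list from the trace abbreviated:

    ceph-mon --id a --mkfs --mon-data=$dir/a --run-dir=$dir
    ceph-mon --id a --mon-data=$dir/a --run-dir=$dir \
        --log-file=$dir'/$name.log' --pid-file=$dir'/$name.pid' \
        --mon-cluster-log-file=$dir/log
        # ...plus the --debug-* and full/avail-ratio overrides in the trace;
        # $name is expanded by the daemon itself, hence the single quotes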
2026-03-08T22:55:34.686 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:34.686 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:34.686 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:34.686 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:55:34.686 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-choose-args/a '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --mon-cluster-log-file=td/crush-choose-args/log --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:55:34.724 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:55:34.724 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:55:34.724 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:55:34.724 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:55:34.724 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:55:34.725 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:55:34.725 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:55:34.725 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:55:34.725 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:55:34.726 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:55:34.727 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:34.727 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 
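[annotation] Once the monitor is running, run_mon reads fsid and mon_host back through the admin socket rather than over the network, with CEPH_ARGS cleared so the test's injected --mon-host/--fsid flags cannot leak into the query. Standalone form, using the socket path from this run:

    asok=/tmp/ceph-asok.50888/ceph-mon.a.asok
    CEPH_ARGS='' ceph --format json daemon $asok config get fsid | jq -r .fsid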
2026-03-08T22:55:34.727 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50888/ceph-mon.a.asok 2026-03-08T22:55:34.727 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:55:34.727 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50888/ceph-mon.a.asok config get fsid 2026-03-08T22:55:34.785 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:55:34.785 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:55:34.785 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:55:34.785 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:55:34.785 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:55:34.785 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:55:34.785 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:55:34.785 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:55:34.785 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:55:34.785 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:34.785 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:34.786 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.50888/ceph-mon.a.asok 2026-03-08T22:55:34.786 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:55:34.786 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.50888/ceph-mon.a.asok config get mon_host 2026-03-08T22:55:34.846 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:170: TEST_reweight: run_mgr td/crush-choose-args x 2026-03-08T22:55:34.846 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/crush-choose-args 2026-03-08T22:55:34.846 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:55:34.846 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:55:34.846 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:55:34.846 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/crush-choose-args/x 2026-03-08T22:55:34.846 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:55:35.001 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:55:35.001 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:35.001 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:35.002 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:35.002 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:35.002 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:35.002 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:55:35.002 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:55:35.004 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/crush-choose-args/x '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:171: TEST_reweight: run_osd td/crush-choose-args 0 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-choose-args 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:55:35.038 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-choose-args/0 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false ' 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/0' 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/0/journal' 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:35.038 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: 
ceph_args+=' --admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:55:35.040 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-choose-args/0 2026-03-08T22:55:35.042 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:55:35.043 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=39ce4309-3248-4b68-a5f3-c31092985694 2026-03-08T22:55:35.043 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 39ce4309-3248-4b68-a5f3-c31092985694' 2026-03-08T22:55:35.043 INFO:tasks.workunit.client.0.vm10.stdout:add osd0 39ce4309-3248-4b68-a5f3-c31092985694 2026-03-08T22:55:35.043 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:55:35.060 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBn/q1p/CmWAxAACLxCYOkjW6f8rLEAC23Y7w== 2026-03-08T22:55:35.061 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBn/q1p/CmWAxAACLxCYOkjW6f8rLEAC23Y7w=="}' 
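[annotation] run_osd mints the OSD's identity before any daemon exists: a fresh uuid, a cephx secret from ceph-authtool, and `ceph osd new`, which registers the pair with the monitor and prints the allocated id ("0" in the next entry). Condensed:

    uuid=$(uuidgen)
    secret=$(ceph-authtool --gen-print-key)
    echo '{"cephx_secret": "'$secret'"}' > $osd_data/new.json
    id=$(ceph osd new $uuid -i $osd_data/new.json)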
2026-03-08T22:55:35.061 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 39ce4309-3248-4b68-a5f3-c31092985694 -i td/crush-choose-args/0/new.json 2026-03-08T22:55:35.200 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:55:35.211 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-choose-args/0/new.json 2026-03-08T22:55:35.212 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBn/q1p/CmWAxAACLxCYOkjW6f8rLEAC23Y7w== --osd-uuid 39ce4309-3248-4b68-a5f3-c31092985694 2026-03-08T22:55:35.237 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:35.236+0000 7f87d6c82780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:35.238 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:35.239+0000 7f87d6c82780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:35.241 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:35.241+0000 7f87d6c82780 -1 WARNING: all dangerous and experimental features are enabled. 
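[annotation] The --mkfs invocation above formats td/crush-choose-args/0 under the minted identity. The bdev "open stat got: (1) Operation not permitted" and "_read_fsid unparsable uuid" entries that follow it here are emitted while bluestore probes the still-empty directory; the run continues past them, so in this harness they read as noise rather than failures. Stripped of the debug flags, the formatting step is:

    ceph-osd -i $id --mkfs \
        --osd-data=$osd_data --osd-journal=$osd_data/journal \
        --key $secret --osd-uuid $uuid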
2026-03-08T22:55:35.241 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:35.242+0000 7f87d6c82780 -1 bdev(0x55c831466800 td/crush-choose-args/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:55:35.241 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:35.242+0000 7f87d6c82780 -1 bluestore(td/crush-choose-args/0) _read_fsid unparsable uuid 2026-03-08T22:55:37.342 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-choose-args/0/keyring 2026-03-08T22:55:37.342 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:55:37.343 INFO:tasks.workunit.client.0.vm10.stdout:adding osd0 key to auth repository 2026-03-08T22:55:37.343 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:55:37.343 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-choose-args/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:55:37.476 INFO:tasks.workunit.client.0.vm10.stdout:start osd.0 2026-03-08T22:55:37.476 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:55:37.476 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:55:37.476 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:55:37.479 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:55:37.481 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:55:37.525 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:37.526+0000 7f19f1364780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:37.533 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:37.533+0000 7f19f1364780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:37.534 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:37.535+0000 7f19f1364780 -1 WARNING: all dangerous and experimental features are enabled. 
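[annotation] osd.0's key is registered and the daemon is started with the same flag set as osd.1 earlier; the log then enters wait_for_osd (ceph-helpers.sh:978-991, traced below), which polls `ceph osd dump` for the "osd.N up" marker once a second, up to 300 tries. As a standalone function:

    wait_for_osd() {
        local state=$1 id=$2
        for ((i = 0; i < 300; i++)); do
            ceph osd dump | grep "osd.$id $state" && return 0
            sleep 1
        done
        return 1
    }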
2026-03-08T22:55:37.652 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:55:37.652 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:55:37.652 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:55:37.653 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:55:37.653 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:55:37.653 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:55:37.653 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:37.653 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:55:37.653 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:37.653 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:55:37.879 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:38.597 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:38.598+0000 7f19f1364780 -1 Falling back to public interface 2026-03-08T22:55:38.880 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:38.881 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:38.881 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:55:38.881 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:55:38.882 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:38.882 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:55:39.107 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:39.462 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:39.462+0000 7f19f1364780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:55:40.109 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:40.109 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:40.109 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:55:40.109 
INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:55:40.110 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:40.110 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:55:40.366 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:40.449 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:40.449+0000 7f19ed304640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:55:41.368 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:55:41.368 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:41.368 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:41.368 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:55:41.368 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:41.368 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:55:41.595 INFO:tasks.workunit.client.0.vm10.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/953864139,v1:127.0.0.1:6803/953864139] [v2:127.0.0.1:6804/953864139,v1:127.0.0.1:6805/953864139] exists,up 39ce4309-3248-4b68-a5f3-c31092985694 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:172: TEST_reweight: run_osd td/crush-choose-args 1 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-choose-args 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-choose-args/1 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 
'ceph_args=--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false ' 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/1' 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/1/journal' 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:41.596 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:55:41.597 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:55:41.597 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-choose-args/1 2026-03-08T22:55:41.599 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:55:41.600 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0a1c6553-6247-4ad6-acb9-97e4ddb7098d 2026-03-08T22:55:41.600 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 0a1c6553-6247-4ad6-acb9-97e4ddb7098d' 2026-03-08T22:55:41.600 INFO:tasks.workunit.client.0.vm10.stdout:add osd1 0a1c6553-6247-4ad6-acb9-97e4ddb7098d 2026-03-08T22:55:41.600 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:55:41.616 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBt/q1pjjm1JBAAYIT/7H06Mm4NbTsKoCU7RA== 2026-03-08T22:55:41.616 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBt/q1pjjm1JBAAYIT/7H06Mm4NbTsKoCU7RA=="}' 2026-03-08T22:55:41.616 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0a1c6553-6247-4ad6-acb9-97e4ddb7098d -i td/crush-choose-args/1/new.json 2026-03-08T22:55:41.866 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:55:41.877 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm 
td/crush-choose-args/1/new.json 2026-03-08T22:55:41.878 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBt/q1pjjm1JBAAYIT/7H06Mm4NbTsKoCU7RA== --osd-uuid 0a1c6553-6247-4ad6-acb9-97e4ddb7098d 2026-03-08T22:55:41.901 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:41.901+0000 7f319ae1e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:41.903 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:41.903+0000 7f319ae1e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:41.904 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:41.904+0000 7f319ae1e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:41.904 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:41.905+0000 7f319ae1e780 -1 bdev(0x5645b2535c00 td/crush-choose-args/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:55:41.904 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:41.905+0000 7f319ae1e780 -1 bluestore(td/crush-choose-args/1) _read_fsid unparsable uuid 2026-03-08T22:55:44.042 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-choose-args/1/keyring 2026-03-08T22:55:44.042 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:55:44.043 INFO:tasks.workunit.client.0.vm10.stdout:adding osd1 key to auth repository 2026-03-08T22:55:44.043 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:55:44.043 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-choose-args/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:55:44.339 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:55:44.339 INFO:tasks.workunit.client.0.vm10.stdout:start osd.1 2026-03-08T22:55:44.339 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 
--osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:55:44.339 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:55:44.340 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:55:44.342 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:55:44.359 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:44.359+0000 7f0171dd4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:44.369 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:44.370+0000 7f0171dd4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:44.371 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:44.371+0000 7f0171dd4780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:44.583 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:55:44.583 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:55:44.583 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:55:44.583 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:55:44.583 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:55:44.583 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:55:44.583 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:44.583 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:55:44.583 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:44.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:44.823 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:45.193 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:45.193+0000 7f0171dd4780 -1 Falling back to public interface 2026-03-08T22:55:45.825 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:55:45.825 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:45.825 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:45.825 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:55:45.825 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:45.825 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:46.060 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:46.322 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:46.322+0000 7f0171dd4780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:55:47.063 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:55:47.063 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:47.063 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:47.063 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:55:47.063 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:47.063 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:47.315 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:48.316 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:55:48.316 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:48.317 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:48.317 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:55:48.318 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:48.318 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:48.546 INFO:tasks.workunit.client.0.vm10.stdout:osd.1 up in weight 1 up_from 8 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/499212404,v1:127.0.0.1:6811/499212404] [v2:127.0.0.1:6812/499212404,v1:127.0.0.1:6813/499212404] exists,up 0a1c6553-6247-4ad6-acb9-97e4ddb7098d 2026-03-08T22:55:48.546 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:55:48.546 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:55:48.546 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:55:48.546 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:174: TEST_reweight: ceph osd crush weight-set create-compat
2026-03-08T22:55:48.824 INFO:tasks.workunit.client.0.vm10.stderr:compat weight-set already created
2026-03-08T22:55:48.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:175: TEST_reweight: ceph osd crush tree
2026-03-08T22:55:49.062 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-08T22:55:49.062 INFO:tasks.workunit.client.0.vm10.stdout:-1 6.00000 root default
2026-03-08T22:55:49.062 INFO:tasks.workunit.client.0.vm10.stdout:-2 6.00000 6.00000 host HOST
2026-03-08T22:55:49.062 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 3.00000 osd.0
2026-03-08T22:55:49.062 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 3.00000 osd.1
2026-03-08T22:55:49.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:177: TEST_reweight: ceph osd crush weight-set reweight-compat osd.0 2
2026-03-08T22:55:49.368 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:178: TEST_reweight: ceph osd crush tree
2026-03-08T22:55:49.598 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-08T22:55:49.598 INFO:tasks.workunit.client.0.vm10.stdout:-1 6.00000 root default
2026-03-08T22:55:49.598 INFO:tasks.workunit.client.0.vm10.stdout:-2 6.00000 5.00000 host HOST
2026-03-08T22:55:49.598 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0
2026-03-08T22:55:49.598 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 3.00000 osd.1
2026-03-08T22:55:49.610 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:179: TEST_reweight: ceph osd crush tree
2026-03-08T22:55:49.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:179: TEST_reweight: grep '6.00000 5.00000'
2026-03-08T22:55:49.612 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:179: TEST_reweight: grep host
2026-03-08T22:55:49.835 INFO:tasks.workunit.client.0.vm10.stdout:-2 6.00000 5.00000 host HOST
2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:181: TEST_reweight: run_osd td/crush-choose-args 2
2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-choose-args
2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636:
run_osd: shift 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-choose-args/2 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false ' 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/2' 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/2/journal' 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:49.836 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:49.837 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:49.837 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:49.837 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:49.837 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' 2026-03-08T22:55:49.838 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:55:49.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-choose-args/2 2026-03-08T22:55:49.839 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:55:49.840 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=ce6fef8c-d73e-4096-a966-a85649585724 2026-03-08T22:55:49.840 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 ce6fef8c-d73e-4096-a966-a85649585724' 2026-03-08T22:55:49.840 INFO:tasks.workunit.client.0.vm10.stdout:add osd2 ce6fef8c-d73e-4096-a966-a85649585724 2026-03-08T22:55:49.840 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:55:49.855 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB1/q1pOub0MhAAimISD4eKX32DOs0nbdWAMA== 2026-03-08T22:55:49.855 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB1/q1pOub0MhAAimISD4eKX32DOs0nbdWAMA=="}' 2026-03-08T22:55:49.855 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new ce6fef8c-d73e-4096-a966-a85649585724 -i td/crush-choose-args/2/new.json 2026-03-08T22:55:50.084 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:55:50.094 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-choose-args/2/new.json 2026-03-08T22:55:50.095 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/2 --osd-journal=td/crush-choose-args/2/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB1/q1pOub0MhAAimISD4eKX32DOs0nbdWAMA== --osd-uuid ce6fef8c-d73e-4096-a966-a85649585724 2026-03-08T22:55:50.116 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:50.117+0000 7fb4d7964780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:50.118 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:50.119+0000 7fb4d7964780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:50.120 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:50.120+0000 7fb4d7964780 -1 WARNING: all dangerous and experimental features are enabled. 
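Note that --admin-socket and --log-file in the ceph-osd invocations above are single-quoted so that $cluster and $name reach the daemon unexpanded; the daemon fills in these metavariables itself, which is how every daemon sharing the run dir gets its own socket and log. As an illustration only (this command is not part of the run), osd.2 of the default cluster "ceph" could then be queried through the socket it derives from that template:

    # hypothetical follow-up: the '$cluster-$name' template expands to "ceph-osd.2"
    ceph --admin-daemon /tmp/ceph-asok.50888/ceph-osd.2.asok config get debug_osd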
2026-03-08T22:55:50.120 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:50.120+0000 7fb4d7964780 -1 bdev(0x561ac9187c00 td/crush-choose-args/2/block) open stat got: (1) Operation not permitted
2026-03-08T22:55:50.120 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:50.120+0000 7fb4d7964780 -1 bluestore(td/crush-choose-args/2) _read_fsid unparsable uuid
2026-03-08T22:55:52.522 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-choose-args/2/keyring
2026-03-08T22:55:52.522 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:55:52.522 INFO:tasks.workunit.client.0.vm10.stdout:adding osd2 key to auth repository
2026-03-08T22:55:52.523 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T22:55:52.523 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-choose-args/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:55:52.829 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T22:55:52.829 INFO:tasks.workunit.client.0.vm10.stdout:start osd.2
2026-03-08T22:55:52.829 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=e75c0fba-fe75-499b-ac80-a647f88b5578 --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/2 --osd-journal=td/crush-choose-args/2/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.50888/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:55:52.829 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:55:52.830 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:55:52.833 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:55:52.850 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:52.850+0000 7faeb90a4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:52.858 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:52.859+0000 7faeb90a4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:52.861 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:52.860+0000 7faeb90a4780 -1 WARNING: all dangerous and experimental features are enabled.
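The wait_for_osd call that follows is the same poll already traced for osd.0 and osd.1. Pieced together from the xtrace (ceph-helpers.sh lines 978-991), the helper is approximately:

    # wait_for_osd as it appears from the xtrace; an approximation, not the verbatim helper
    wait_for_osd() {
        local state=$1
        local id=$2
        status=1
        for ((i = 0; i < 300; i++)); do
            echo $i                                     # the bare counters on stdout
            if ceph osd dump | grep "osd.$id $state"; then
                status=0                                # the map finally shows the OSD in $state
                break
            fi
            sleep 1
        done
        return $status
    }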
2026-03-08T22:55:53.067 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:55:53.067 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:55:53.067 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:55:53.067 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:55:53.068 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:55:53.068 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:55:53.068 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:53.068 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:55:53.068 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:53.068 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:55:53.299 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:53.420 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:53.421+0000 7faeb90a4780 -1 Falling back to public interface 2026-03-08T22:55:54.301 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:54.301 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:54.301 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:55:54.301 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:55:54.302 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:54.302 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:55:54.366 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:54.363+0000 7faeb90a4780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:55:54.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:55.541 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:55:55.541 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:55.541 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:55.541 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2
2026-03-08T22:55:55.541 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:55.541 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:55:55.801 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:55.866 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:55.866+0000 7faeb5044640 -1 osd.2 0 waiting for initial osdmap
2026-03-08T22:55:56.803 INFO:tasks.workunit.client.0.vm10.stdout:3
2026-03-08T22:55:56.803 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:56.803 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:56.803 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:55:56.803 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:56.803 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:55:57.050 INFO:tasks.workunit.client.0.vm10.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1197366878,v1:127.0.0.1:6819/1197366878] [v2:127.0.0.1:6820/1197366878,v1:127.0.0.1:6821/1197366878] exists,up ce6fef8c-d73e-4096-a966-a85649585724
2026-03-08T22:55:57.050 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:55:57.050 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:55:57.050 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:55:57.051 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:182: TEST_reweight: ceph osd crush tree
2026-03-08T22:55:57.355 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-08T22:55:57.355 INFO:tasks.workunit.client.0.vm10.stdout:-1 9.00000 root default
2026-03-08T22:55:57.355 INFO:tasks.workunit.client.0.vm10.stdout:-2 9.00000 5.00000 host HOST
2026-03-08T22:55:57.355 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0
2026-03-08T22:55:57.355 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 3.00000 osd.1
2026-03-08T22:55:57.356 INFO:tasks.workunit.client.0.vm10.stdout: 2 3.00000 0 osd.2
2026-03-08T22:55:57.366 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:183: TEST_reweight: ceph osd crush tree
2026-03-08T22:55:57.366 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:183: TEST_reweight: grep '9.00000 5.00000'
2026-03-08T22:55:57.367 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:183: TEST_reweight: grep host
2026-03-08T22:55:57.625 INFO:tasks.workunit.client.0.vm10.stdout:-2 9.00000 5.00000 host HOST
2026-03-08T22:55:57.625 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:185: TEST_reweight: ceph osd crush reweight osd.2 4
2026-03-08T22:55:57.912 INFO:tasks.workunit.client.0.vm10.stderr:reweighted item id 2 name 'osd.2' to 4 in crush map
2026-03-08T22:55:57.926 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:186: TEST_reweight: ceph osd crush tree
2026-03-08T22:55:58.148 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-08T22:55:58.148 INFO:tasks.workunit.client.0.vm10.stdout:-1 10.00000 root default
2026-03-08T22:55:58.148 INFO:tasks.workunit.client.0.vm10.stdout:-2 10.00000 5.00000 host HOST
2026-03-08T22:55:58.148 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0
2026-03-08T22:55:58.148 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 3.00000 osd.1
2026-03-08T22:55:58.148 INFO:tasks.workunit.client.0.vm10.stdout: 2 4.00000 0 osd.2
2026-03-08T22:55:58.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:187: TEST_reweight: ceph osd crush tree
2026-03-08T22:55:58.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:187: TEST_reweight: grep '10.00000 5.00000'
2026-03-08T22:55:58.159 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:187: TEST_reweight: grep host
2026-03-08T22:55:58.400 INFO:tasks.workunit.client.0.vm10.stdout:-2 10.00000 5.00000 host HOST
2026-03-08T22:55:58.400 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:189: TEST_reweight: ceph osd crush weight-set reweight-compat osd.2 4
2026-03-08T22:55:58.699 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:190: TEST_reweight: ceph osd crush tree
2026-03-08T22:55:58.920 INFO:tasks.workunit.client.0.vm10.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-08T22:55:58.920 INFO:tasks.workunit.client.0.vm10.stdout:-1 10.00000 root default
2026-03-08T22:55:58.920 INFO:tasks.workunit.client.0.vm10.stdout:-2 10.00000 9.00000 host HOST
2026-03-08T22:55:58.920 INFO:tasks.workunit.client.0.vm10.stdout: 0 3.00000 2.00000 osd.0
2026-03-08T22:55:58.920 INFO:tasks.workunit.client.0.vm10.stdout: 1 3.00000 3.00000 osd.1
2026-03-08T22:55:58.920 INFO:tasks.workunit.client.0.vm10.stdout: 2 4.00000 4.00000 osd.2
2026-03-08T22:55:58.930 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:191: TEST_reweight: ceph osd crush tree
2026-03-08T22:55:58.931 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:191: TEST_reweight: grep host
2026-03-08T22:55:58.931 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:191: TEST_reweight: grep '10.00000 9.00000'
2026-03-08T22:55:59.188 INFO:tasks.workunit.client.0.vm10.stdout:-2 10.00000 9.00000 host HOST
2026-03-08T22:55:59.188 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:41: run: teardown td/crush-choose-args
2026-03-08T22:55:59.188 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-choose-args
2026-03-08T22:55:59.188 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:55:59.188 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-choose-args KILL
2026-03-08T22:55:59.188 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:55:59.188 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:55:59.189 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:55:59.189 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:55:59.189 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:55:59.320 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:55:59.320 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:55:59.322 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:55:59.322 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
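The three trees above are the point of TEST_reweight: ceph osd crush reweight moves the real WEIGHT column (and the bucket sums), while weight-set reweight-compat only moves the (compat) column. Stripped of the grep assertions, the sequence the test drives is:

    # the weight-set exercise driven by TEST_reweight (commands as they appear in the trace)
    ceph osd crush weight-set create-compat            # no-op here: "compat weight-set already created"
    ceph osd crush weight-set reweight-compat osd.0 2  # (compat) column only: host goes 6.0 -> 5.0
    ceph osd crush reweight osd.2 4                    # real WEIGHT column: root/host go 9.0 -> 10.0
    ceph osd crush weight-set reweight-compat osd.2 4  # (compat) column: host goes 5.0 -> 9.0
    ceph osd crush tree                                # prints WEIGHT and (compat) side by side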
2026-03-08T22:55:59.323 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:55:59.323 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:55:59.323 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:55:59.325 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:59.325 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:55:59.325 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:55:59.325 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:59.326 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:55:59.328 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:55:59.328 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-choose-args 2026-03-08T22:55:59.341 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:55:59.342 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:59.342 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:59.342 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50888 2026-03-08T22:55:59.343 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:55:59.343 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:55:59.343 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0 2026-03-08T22:55:59.343 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/crush-choose-args 0 2026-03-08T22:55:59.343 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-choose-args 2026-03-08T22:55:59.344 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0 2026-03-08T22:55:59.344 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-choose-args KILL 2026-03-08T22:55:59.344 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:59.344 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:59.344 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:59.344 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:59.344 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:59.346 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:59.346 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:55:59.347 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:55:59.347 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:55:59.349 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:55:59.349 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:55:59.349 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:55:59.350 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:59.350 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:55:59.350 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:55:59.350 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:59.351 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:55:59.352 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-08T22:55:59.352 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-choose-args 2026-03-08T22:55:59.354 
INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:55:59.354 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:59.354 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.50888 2026-03-08T22:55:59.354 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.50888 2026-03-08T22:55:59.355 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:55:59.355 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:55:59.355 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T22:55:59.355 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T22:55:59.356 DEBUG:teuthology.orchestra.run.vm10:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T22:55:59.423 INFO:tasks.workunit:Running workunit crush/crush-classes.sh... 2026-03-08T22:55:59.423 DEBUG:teuthology.orchestra.run.vm10:workunit test crush/crush-classes.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh 2026-03-08T22:55:59.485 INFO:tasks.workunit.client.0.vm10.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null 2026-03-08T22:55:59.489 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/crush-classes 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:21: run: local dir=td/crush-classes 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:22: run: shift 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:24: run: export CEPH_MON=127.0.0.1:7130 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:24: run: CEPH_MON=127.0.0.1:7130 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:25: run: export CEPH_ARGS 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:26: run: uuidgen 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:26: run: CEPH_ARGS+='--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none ' 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:27: run: CEPH_ARGS+='--mon-host=127.0.0.1:7130 ' 2026-03-08T22:55:59.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:31: run: CEPH_ARGS+='--osd-class-update-on-start=false ' 2026-03-08T22:55:59.490 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:33: run: set 2026-03-08T22:55:59.490 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:33: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p' 2026-03-08T22:55:59.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:33: run: local 'funcs=TEST_classes 2026-03-08T22:55:59.492 INFO:tasks.workunit.client.0.vm10.stderr:TEST_mon_classes 2026-03-08T22:55:59.492 INFO:tasks.workunit.client.0.vm10.stderr:TEST_reweight_vs_classes 2026-03-08T22:55:59.492 INFO:tasks.workunit.client.0.vm10.stderr:TEST_set_device_class' 2026-03-08T22:55:59.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:34: run: for func in $funcs 2026-03-08T22:55:59.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:35: run: setup td/crush-classes 2026-03-08T22:55:59.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/crush-classes 2026-03-08T22:55:59.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/crush-classes 2026-03-08T22:55:59.492 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-classes 2026-03-08T22:55:59.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:55:59.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-classes KILL 2026-03-08T22:55:59.492 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:59.493 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:59.493 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:59.493 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:59.493 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:59.495 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:59.495 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:55:59.496 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:55:59.496 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:55:59.497 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:55:59.497 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:55:59.497 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:55:59.498 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:59.498 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:55:59.498 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:55:59.498 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:59.499 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:55:59.500 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:55:59.500 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-classes 2026-03-08T22:55:59.501 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:55:59.501 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:59.501 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:55:59.501 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.66179 2026-03-08T22:55:59.502 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:55:59.502 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:55:59.502 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/crush-classes 2026-03-08T22:55:59.504 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:55:59.504 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:59.504 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:55:59.504 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.66179 2026-03-08T22:55:59.505 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:55:59.505 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']' 2026-03-08T22:55:59.505 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096 2026-03-08T22:55:59.505 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:55:59.505 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/crush-classes 1' TERM HUP INT 2026-03-08T22:55:59.505 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:36: run: TEST_classes td/crush-classes 2026-03-08T22:55:59.506 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:89: TEST_classes: local dir=td/crush-classes 2026-03-08T22:55:59.506 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:91: TEST_classes: run_mon td/crush-classes a 2026-03-08T22:55:59.506 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/crush-classes 2026-03-08T22:55:59.506 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:55:59.506 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:55:59.506 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:55:59.506 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/crush-classes/a 2026-03-08T22:55:59.506 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-classes/a --run-dir=td/crush-classes 2026-03-08T22:55:59.531 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:55:59.531 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:59.531 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:59.531 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:59.531 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:59.531 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo 
/tmp/ceph-asok.66179 2026-03-08T22:55:59.531 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:55:59.532 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-classes/a '--log-file=td/crush-classes/$name.log' '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --mon-cluster-log-file=td/crush-classes/log --run-dir=td/crush-classes '--pid-file=td/crush-classes/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:55:59.563 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:55:59.565 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:55:59.565 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:55:59.565 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:55:59.565 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:55:59.565 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:55:59.566 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:55:59.566 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:55:59.566 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:55:59.567 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:55:59.567 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:59.567 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:55:59.567 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.66179/ceph-mon.a.asok 2026-03-08T22:55:59.568 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:55:59.568 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format 
json daemon /tmp/ceph-asok.66179/ceph-mon.a.asok config get fsid 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.66179/ceph-mon.a.asok 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:55:59.621 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.66179/ceph-mon.a.asok config get mon_host 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:92: TEST_classes: run_osd td/crush-classes 0 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-classes 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/0 2026-03-08T22:55:59.677 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/0' 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/0/journal' 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:59.677 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:55:59.678 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:55:59.678 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/0 2026-03-08T22:55:59.679 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:55:59.680 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=4076f23b-83a3-4194-a540-41fddf39f0dd 2026-03-08T22:55:59.680 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 4076f23b-83a3-4194-a540-41fddf39f0dd' 2026-03-08T22:55:59.680 INFO:tasks.workunit.client.0.vm10.stdout:add osd0 4076f23b-83a3-4194-a540-41fddf39f0dd 2026-03-08T22:55:59.681 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:55:59.695 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB//q1pEstsKRAAf8p0TdCARj3NK14SFKc+mA== 2026-03-08T22:55:59.695 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB//q1pEstsKRAAf8p0TdCARj3NK14SFKc+mA=="}' 2026-03-08T22:55:59.695 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 4076f23b-83a3-4194-a540-41fddf39f0dd -i td/crush-classes/0/new.json 2026-03-08T22:55:59.821 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:55:59.830 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm 
td/crush-classes/0/new.json 2026-03-08T22:55:59.831 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB//q1pEstsKRAAf8p0TdCARj3NK14SFKc+mA== --osd-uuid 4076f23b-83a3-4194-a540-41fddf39f0dd 2026-03-08T22:55:59.853 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:59.854+0000 7f6e0fc20780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:59.855 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:59.856+0000 7f6e0fc20780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:59.856 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:59.857+0000 7f6e0fc20780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:59.857 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:59.857+0000 7f6e0fc20780 -1 bdev(0x564effd4fc00 td/crush-classes/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:55:59.857 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:55:59.857+0000 7f6e0fc20780 -1 bluestore(td/crush-classes/0) _read_fsid unparsable uuid 2026-03-08T22:56:02.035 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/0/keyring 2026-03-08T22:56:02.035 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:56:02.035 INFO:tasks.workunit.client.0.vm10.stdout:adding osd0 key to auth repository 2026-03-08T22:56:02.035 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:56:02.035 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:56:02.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:56:02.158 INFO:tasks.workunit.client.0.vm10.stdout:start osd.0 2026-03-08T22:56:02.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 
'--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:02.158 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:56:02.159 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:56:02.161 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:56:02.178 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:02.178+0000 7fdd9a82c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:02.185 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:02.186+0000 7fdd9a82c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:02.187 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:02.187+0000 7fdd9a82c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:02.279 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:56:02.279 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:56:02.279 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:56:02.279 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:56:02.279 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:56:02.279 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:02.279 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:56:02.279 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:56:02.279 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:02.279 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:56:02.396 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:03.285 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:03.285+0000 7fdd9a82c780 -1 Falling back to public interface 2026-03-08T22:56:03.397 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:03.397 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:03.397 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:56:03.397 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:56:03.398 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:03.398 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:56:03.517 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:04.168 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:04.169+0000 7fdd9a82c780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:56:04.520 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:56:04.520 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:04.520 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:04.520 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:56:04.520 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:04.520 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:56:04.642 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:05.644 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:56:05.644 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:05.644 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:05.644 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:56:05.644 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:05.644 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/430405823,v1:127.0.0.1:6801/430405823] [v2:127.0.0.1:6802/430405823,v1:127.0.0.1:6803/430405823] exists,up 4076f23b-83a3-4194-a540-41fddf39f0dd 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 
2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:93: TEST_classes: run_osd td/crush-classes 1 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-classes 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/1 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/1' 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/1/journal' 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:56:05.756 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:05.757 
INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:56:05.757 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:56:05.758 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/1 2026-03-08T22:56:05.759 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:56:05.759 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=8438b52a-12b3-4df6-96dd-70c6e2392b72 2026-03-08T22:56:05.759 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 8438b52a-12b3-4df6-96dd-70c6e2392b72' 2026-03-08T22:56:05.759 
INFO:tasks.workunit.client.0.vm10.stdout:add osd1 8438b52a-12b3-4df6-96dd-70c6e2392b72 2026-03-08T22:56:05.760 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:56:05.772 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCF/q1ptlAOLhAACgq5KY1dGxcIMHldZvd1NA== 2026-03-08T22:56:05.772 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCF/q1ptlAOLhAACgq5KY1dGxcIMHldZvd1NA=="}' 2026-03-08T22:56:05.772 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 8438b52a-12b3-4df6-96dd-70c6e2392b72 -i td/crush-classes/1/new.json 2026-03-08T22:56:05.895 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:56:05.903 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-classes/1/new.json 2026-03-08T22:56:05.903 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCF/q1ptlAOLhAACgq5KY1dGxcIMHldZvd1NA== --osd-uuid 8438b52a-12b3-4df6-96dd-70c6e2392b72 2026-03-08T22:56:05.923 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:05.923+0000 7f063f41c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:05.924 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:05.925+0000 7f063f41c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:05.925 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:05.926+0000 7f063f41c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:56:05.926 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:05.926+0000 7f063f41c780 -1 bdev(0x561bacd65c00 td/crush-classes/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:56:05.926 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:05.926+0000 7f063f41c780 -1 bluestore(td/crush-classes/1) _read_fsid unparsable uuid 2026-03-08T22:56:08.826 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/1/keyring 2026-03-08T22:56:08.826 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:56:08.827 INFO:tasks.workunit.client.0.vm10.stdout:adding osd1 key to auth repository 2026-03-08T22:56:08.827 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:56:08.827 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:56:08.959 INFO:tasks.workunit.client.0.vm10.stdout:start osd.1 2026-03-08T22:56:08.959 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:56:08.959 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:08.959 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:56:08.960 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:56:08.962 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:56:08.980 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:08.979+0000 7f3e27021780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:08.980 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:08.981+0000 7f3e27021780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:08.982 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:08.982+0000 7f3e27021780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:56:09.088 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T22:56:09.089 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:56:09.089 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:56:09.089 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:56:09.089 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:56:09.089 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:09.089 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:56:09.089 INFO:tasks.workunit.client.0.vm10.stdout:0
2026-03-08T22:56:09.089 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:09.089 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:09.203 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:09.810 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:09.810+0000 7f3e27021780 -1 Falling back to public interface
2026-03-08T22:56:10.204 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:10.204 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:10.204 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:56:10.204 INFO:tasks.workunit.client.0.vm10.stdout:1
2026-03-08T22:56:10.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:10.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:10.317 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:10.670 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:10.670+0000 7f3e27021780 -1 osd.1 0 log_to_monitors true
2026-03-08T22:56:11.318 INFO:tasks.workunit.client.0.vm10.stdout:2
2026-03-08T22:56:11.318 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:11.318 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:11.318 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:56:11.320 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:11.320 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:11.473 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:11.672 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:11.672+0000 7f3e22a19640 -1 osd.1 0 waiting for initial osdmap
2026-03-08T22:56:12.475 INFO:tasks.workunit.client.0.vm10.stdout:3
2026-03-08T22:56:12.475 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:12.475 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:12.475 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:56:12.475 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:12.475 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stdout:osd.1 up in weight 1 up_from 8 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6808/1901504577,v1:127.0.0.1:6809/1901504577] [v2:127.0.0.1:6810/1901504577,v1:127.0.0.1:6811/1901504577] exists,up 8438b52a-12b3-4df6-96dd-70c6e2392b72
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:94: TEST_classes: run_osd td/crush-classes 2
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-classes
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/2
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false '
2026-03-08T22:56:12.584 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/2'
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/2/journal'
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes'
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:12.585 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok'
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok'
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log'
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid'
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:56:12.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/2
2026-03-08T22:56:12.587 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:56:12.588 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=7cbb2111-156d-48ba-816d-e481e0fc9313
2026-03-08T22:56:12.588 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 7cbb2111-156d-48ba-816d-e481e0fc9313'
2026-03-08T22:56:12.588 INFO:tasks.workunit.client.0.vm10.stdout:add osd2 7cbb2111-156d-48ba-816d-e481e0fc9313
2026-03-08T22:56:12.589 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:56:12.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCM/q1ptofiIxAAYJwwxVcvl7frDd+gxbFHVw==
2026-03-08T22:56:12.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCM/q1ptofiIxAAYJwwxVcvl7frDd+gxbFHVw=="}'
2026-03-08T22:56:12.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 7cbb2111-156d-48ba-816d-e481e0fc9313 -i td/crush-classes/2/new.json
2026-03-08T22:56:12.721 INFO:tasks.workunit.client.0.vm10.stdout:2
2026-03-08T22:56:12.729 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-classes/2/new.json
2026-03-08T22:56:12.730 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCM/q1ptofiIxAAYJwwxVcvl7frDd+gxbFHVw== --osd-uuid 7cbb2111-156d-48ba-816d-e481e0fc9313
2026-03-08T22:56:12.750 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:12.751+0000 7fd9b51d4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:12.753 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:12.753+0000 7fd9b51d4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:12.755 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:12.755+0000 7fd9b51d4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:12.755 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:12.755+0000 7fd9b51d4780 -1 bdev(0x5589aa479c00 td/crush-classes/2/block) open stat got: (1) Operation not permitted
2026-03-08T22:56:12.755 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:12.755+0000 7fd9b51d4780 -1 bluestore(td/crush-classes/2) _read_fsid unparsable uuid
2026-03-08T22:56:14.909 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/2/keyring
2026-03-08T22:56:14.910 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:56:14.910 INFO:tasks.workunit.client.0.vm10.stdout:adding osd2 key to auth repository
2026-03-08T22:56:14.910 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T22:56:14.911 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:56:15.031 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T22:56:15.031 INFO:tasks.workunit.client.0.vm10.stdout:start osd.2
2026-03-08T22:56:15.032 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:56:15.032 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:56:15.033 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:56:15.035 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:56:15.052 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:15.051+0000 7f4855016780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:15.061 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:15.062+0000 7f4855016780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:15.062 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:15.063+0000 7f4855016780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:15.152 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T22:56:15.152 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:56:15.152 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T22:56:15.152 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:56:15.152 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:56:15.152 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:15.152 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:56:15.152 INFO:tasks.workunit.client.0.vm10.stdout:0
2026-03-08T22:56:15.152 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:15.152 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:56:15.266 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:16.140 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:16.141+0000 7f4855016780 -1 Falling back to public interface
2026-03-08T22:56:16.267 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:16.268 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:16.268 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:56:16.268 INFO:tasks.workunit.client.0.vm10.stdout:1
2026-03-08T22:56:16.268 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:16.268 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:56:16.383 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:17.015 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:17.015+0000 7f4855016780 -1 osd.2 0 log_to_monitors true
2026-03-08T22:56:17.385 INFO:tasks.workunit.client.0.vm10.stdout:2
2026-03-08T22:56:17.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:17.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:17.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:56:17.386 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:56:17.386 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:17.520 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:18.224 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:18.224+0000 7f4850963640 -1 osd.2 0 waiting for initial osdmap
2026-03-08T22:56:18.522 INFO:tasks.workunit.client.0.vm10.stdout:3
2026-03-08T22:56:18.522 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:18.522 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:18.522 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:56:18.522 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:18.522 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:56:18.639 INFO:tasks.workunit.client.0.vm10.stdout:osd.2 up in weight 1 up_from 12 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6816/3622669026,v1:127.0.0.1:6817/3622669026] [v2:127.0.0.1:6818/3622669026,v1:127.0.0.1:6819/3622669026] exists,up 7cbb2111-156d-48ba-816d-e481e0fc9313
2026-03-08T22:56:18.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:56:18.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:56:18.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:56:18.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:95: TEST_classes: create_rbd_pool
2026-03-08T22:56:18.639 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it
2026-03-08T22:56:18.783 INFO:tasks.workunit.client.0.vm10.stderr:pool 'rbd' does not exist
2026-03-08T22:56:18.791 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4
2026-03-08T22:56:18.791 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4
2026-03-08T22:56:18.990 INFO:tasks.workunit.client.0.vm10.stderr:pool 'rbd' already exists
2026-03-08T22:56:18.998 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T22:56:20.000 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd
2026-03-08T22:56:20.359 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:97: TEST_classes: get_osds_up rbd SOMETHING
2026-03-08T22:56:20.360 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd
2026-03-08T22:56:20.360 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING
2026-03-08T22:56:20.360 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING
2026-03-08T22:56:20.362 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . -o ' '
2026-03-08T22:56:20.492 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=1 2 0 '
2026-03-08T22:56:20.492 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 1 2 0
2026-03-08T22:56:20.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:97: TEST_classes: test '1 2 0' == '1 2 0'
2026-03-08T22:56:20.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:98: TEST_classes: add_something td/crush-classes SOMETHING
2026-03-08T22:56:20.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:42: add_something: local dir=td/crush-classes
2026-03-08T22:56:20.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:43: add_something: local obj=SOMETHING
2026-03-08T22:56:20.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:45: add_something: local payload=ABCDEF
2026-03-08T22:56:20.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:46: add_something: echo ABCDEF
2026-03-08T22:56:20.492 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:47: add_something: rados --pool rbd put SOMETHING td/crush-classes/ORIGINAL
2026-03-08T22:56:20.520 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:104: TEST_classes: ceph osd getcrushmap
2026-03-08T22:56:20.630 INFO:tasks.workunit.client.0.vm10.stderr:4
2026-03-08T22:56:20.638 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:105: TEST_classes: crushtool -d td/crush-classes/map -o td/crush-classes/map.txt
2026-03-08T22:56:20.653 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:106: TEST_classes: sed -i -e '/device 0 osd.0/s/$/ class ssd/' -e '/step take default/s/$/ class ssd/' td/crush-classes/map.txt
2026-03-08T22:56:20.654 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:110: TEST_classes: crushtool -c td/crush-classes/map.txt -o td/crush-classes/map-new
2026-03-08T22:56:20.668 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:111: TEST_classes: ceph osd setcrushmap -i td/crush-classes/map-new
2026-03-08T22:56:20.926 INFO:tasks.workunit.client.0.vm10.stderr:6
2026-03-08T22:56:20.946 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:117: TEST_classes: ok=false
2026-03-08T22:56:20.946 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:118: TEST_classes: for delay in 2 4 8 16 32 64 128 256
2026-03-08T22:56:20.946 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:119: TEST_classes: get_osds_up rbd SOMETHING_ELSE
2026-03-08T22:56:20.946 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd
2026-03-08T22:56:20.946 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING_ELSE
2026-03-08T22:56:20.947 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING_ELSE
2026-03-08T22:56:20.947 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . -o ' '
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=0 '
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 0
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:119: TEST_classes: test 0 == 0
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:120: TEST_classes: ok=true
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:121: TEST_classes: break
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:127: TEST_classes: true
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:132: TEST_classes: add_something td/crush-classes SOMETHING_ELSE
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:42: add_something: local dir=td/crush-classes
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:43: add_something: local obj=SOMETHING_ELSE
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:45: add_something: local payload=ABCDEF
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:46: add_something: echo ABCDEF
2026-03-08T22:56:21.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:47: add_something: rados --pool rbd put SOMETHING_ELSE td/crush-classes/ORIGINAL
2026-03-08T22:56:21.106 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:138: TEST_classes: ceph osd crush dump
2026-03-08T22:56:21.106 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:138: TEST_classes: grep -q '~ssd'
2026-03-08T22:56:21.223 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:37: run: teardown td/crush-classes
2026-03-08T22:56:21.223 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-classes
2026-03-08T22:56:21.223 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:56:21.223 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-classes KILL
2026-03-08T22:56:21.223 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:56:21.223 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:56:21.223 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:56:21.223 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:56:21.223 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:56:21.337 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:56:21.337 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:56:21.338 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:56:21.338 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
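Note: the sequence at crush-classes.sh:104-111 above is the standard CRUSH map round-trip: dump the binary map, decompile it, tag device 0 and the rule's "step take default" with a device class, recompile, and inject the result, after which new objects (SOMETHING_ELSE) map only to osd.0 while the earlier retry loop (delays 2 4 8 ...) gives the change time to take effect. A minimal sketch of the same round-trip, with the redirect of getcrushmap to a file assumed from context (the bare "4" and "6" on stderr appear to be the map versions reported by getcrushmap and setcrushmap):

    dir=td/crush-classes                           # working dir used by the test
    ceph osd getcrushmap > "$dir/map"              # binary CRUSH map to stdout
    crushtool -d "$dir/map" -o "$dir/map.txt"      # decompile to editable text
    sed -i -e '/device 0 osd.0/s/$/ class ssd/' \
           -e '/step take default/s/$/ class ssd/' "$dir/map.txt"
    crushtool -c "$dir/map.txt" -o "$dir/map-new"  # recompile the edited map
    ceph osd setcrushmap -i "$dir/map-new"         # inject; rule now restricted to class ssd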
2026-03-08T22:56:21.339 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:56:21.339 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:56:21.339 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:56:21.340 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:56:21.340 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:56:21.340 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:56:21.340 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:56:21.341 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:56:21.342 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:56:21.342 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-classes
2026-03-08T22:56:21.357 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:56:21.357 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:21.357 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179
2026-03-08T22:56:21.357 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.66179
2026-03-08T22:56:21.358 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:56:21.358 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:56:21.358 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:34: run: for func in $funcs
2026-03-08T22:56:21.358 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:35: run: setup td/crush-classes
2026-03-08T22:56:21.358 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/crush-classes
2026-03-08T22:56:21.358 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/crush-classes
2026-03-08T22:56:21.358 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-classes
2026-03-08T22:56:21.358 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:56:21.359 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-classes KILL
2026-03-08T22:56:21.359 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:56:21.359 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:56:21.359 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:56:21.359 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:56:21.359 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:56:21.360 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:56:21.360 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:56:21.361 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:56:21.361 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
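Note: teardown's trace above (ceph-helpers.sh:164-207) shows its cleanup flow: kill the daemons, skip the btrfs-only handling, decide whether any core dumps were left behind by inspecting kernel.core_pattern, then remove the test dir and the admin-socket dir. A loose bash reconstruction of the coredump check from the xtrace alone; the helper's exact conditionals may differ:

    cores=no
    pattern=$(sysctl -n kernel.core_pattern)    # here .../archive/coredump/%t.%p.core
    if [ "${pattern:0:1}" != '|' ]; then        # not piped to a crash handler
        # pattern names plain files ending in "core"; anything left in that
        # directory is treated as a dump (approximation of sh:180)
        if echo "$pattern" | grep -q '^core\|core$' &&
           [ -n "$(ls "$(dirname "$pattern")" 2>/dev/null)" ]; then
            cores=yes
        fi
    fi
    [ "$cores" = yes ] && echo "core dumps were left behind"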
2026-03-08T22:56:21.362 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:56:21.362 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:56:21.362 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:56:21.363 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:56:21.363 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:56:21.363 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:56:21.363 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:56:21.364 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:56:21.365 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:56:21.365 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-classes
2026-03-08T22:56:21.366 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:56:21.366 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:21.366 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179
2026-03-08T22:56:21.366 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.66179
2026-03-08T22:56:21.367 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:56:21.367 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:56:21.367 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/crush-classes
2026-03-08T22:56:21.368 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:56:21.368 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:21.368 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179
2026-03-08T22:56:21.368 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.66179
2026-03-08T22:56:21.369 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:56:21.369 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:56:21.369 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:56:21.369 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/crush-classes 1' TERM HUP INT
2026-03-08T22:56:21.370 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:36: run: TEST_mon_classes td/crush-classes
2026-03-08T22:56:21.370 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:167: TEST_mon_classes: local dir=td/crush-classes
2026-03-08T22:56:21.370 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:169: TEST_mon_classes: run_mon td/crush-classes a
2026-03-08T22:56:21.370 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/crush-classes
2026-03-08T22:56:21.370 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:56:21.370 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:56:21.370 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:56:21.370 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/crush-classes/a
2026-03-08T22:56:21.370 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-classes/a --run-dir=td/crush-classes
2026-03-08T22:56:21.395 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:56:21.395 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:56:21.395 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:56:21.395 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:56:21.395 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:21.395 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179
2026-03-08T22:56:21.395 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok'
2026-03-08T22:56:21.395 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-classes/a '--log-file=td/crush-classes/$name.log' '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --mon-cluster-log-file=td/crush-classes/log --run-dir=td/crush-classes '--pid-file=td/crush-classes/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:56:21.425 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:56:21.425 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:56:21.425 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:56:21.425 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:56:21.425 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:56:21.426 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:56:21.426 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:56:21.426 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:56:21.426 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:56:21.426 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:56:21.426 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:21.426 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179
2026-03-08T22:56:21.426 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.66179/ceph-mon.a.asok
2026-03-08T22:56:21.427 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:56:21.427 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.66179/ceph-mon.a.asok config get fsid
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179
2026-03-08T22:56:21.482 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.66179/ceph-mon.a.asok
2026-03-08T22:56:21.483 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:56:21.483 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.66179/ceph-mon.a.asok config get mon_host
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:170: TEST_mon_classes: run_osd td/crush-classes 0
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-classes
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/0
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false '
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/0'
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/0/journal'
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes'
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:56:21.537 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:56:21.538 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:56:21.538 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:21.538 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179
2026-03-08T22:56:21.538 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok'
2026-03-08T22:56:21.538 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok'
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log'
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid'
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:56:21.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/0
2026-03-08T22:56:21.540 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:56:21.541 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=6f69e997-1432-4db4-a5c0-263a9cb71c69
2026-03-08T22:56:21.541 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 6f69e997-1432-4db4-a5c0-263a9cb71c69'
2026-03-08T22:56:21.541 INFO:tasks.workunit.client.0.vm10.stdout:add osd0 6f69e997-1432-4db4-a5c0-263a9cb71c69
2026-03-08T22:56:21.541 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:56:21.557 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCV/q1p2sc1IRAAqPyoHa9rkF72LT/0jqnaig==
2026-03-08T22:56:21.557 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCV/q1p2sc1IRAAqPyoHa9rkF72LT/0jqnaig=="}'
2026-03-08T22:56:21.557 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 6f69e997-1432-4db4-a5c0-263a9cb71c69 -i td/crush-classes/0/new.json
2026-03-08T22:56:21.678 INFO:tasks.workunit.client.0.vm10.stdout:0
2026-03-08T22:56:21.685 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-classes/0/new.json
2026-03-08T22:56:21.685 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCV/q1p2sc1IRAAqPyoHa9rkF72LT/0jqnaig== --osd-uuid 6f69e997-1432-4db4-a5c0-263a9cb71c69
2026-03-08T22:56:21.706 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:21.707+0000 7f43d9cd2780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:21.708 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:21.709+0000 7f43d9cd2780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:21.709 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:21.709+0000 7f43d9cd2780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:21.710 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:21.710+0000 7f43d9cd2780 -1 bdev(0x55737d8cbc00 td/crush-classes/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:56:21.710 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:21.710+0000 7f43d9cd2780 -1 bluestore(td/crush-classes/0) _read_fsid unparsable uuid
2026-03-08T22:56:23.878 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/0/keyring
2026-03-08T22:56:23.878 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:56:23.879 INFO:tasks.workunit.client.0.vm10.stdout:adding osd0 key to auth repository
2026-03-08T22:56:23.879 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:56:23.879 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:56:24.009 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:56:24.010 INFO:tasks.workunit.client.0.vm10.stdout:start osd.0
2026-03-08T22:56:24.010 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:56:24.010 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:56:24.011 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:56:24.013 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:56:24.029 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:24.029+0000 7f4f3afa3780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:24.039 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:24.039+0000 7f4f3afa3780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:24.040 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:24.040+0000 7f4f3afa3780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:24.130 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:56:24.131 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:56:24.131 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:56:24.131 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:56:24.131 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:56:24.131 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:24.131 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:56:24.131 INFO:tasks.workunit.client.0.vm10.stdout:0
2026-03-08T22:56:24.131 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:24.131 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:56:24.324 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:24.875 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:24.876+0000 7f4f3afa3780 -1 Falling back to public interface
2026-03-08T22:56:25.326 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:25.326 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:25.326
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:56:25.326 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:56:25.327 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:25.327 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:56:25.453 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:25.719 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:25.720+0000 7f4f3afa3780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:56:26.455 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:56:26.455 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:26.456 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:26.456 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:56:26.456 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:26.456 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:56:26.569 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:27.571 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:56:27.571 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:27.571 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:27.571 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:56:27.572 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:27.572 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:56:27.696 INFO:tasks.workunit.client.0.vm10.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/1496739783,v1:127.0.0.1:6801/1496739783] [v2:127.0.0.1:6802/1496739783,v1:127.0.0.1:6803/1496739783] exists,up 6f69e997-1432-4db4-a5c0-263a9cb71c69 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 
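
The xtrace above walks wait_for_osd to completion for osd.0: it polls ceph osd dump once per second, printing the attempt counter to stdout, until the osdmap shows the OSD up. A minimal sketch of the loop reconstructed from the traced lines (ceph-helpers.sh:978-991); this is a sketch, not the verbatim helper:

    # Reconstructed from the xtrace; polls up to 300 times at 1s intervals.
    function wait_for_osd() {
        local state=$1            # "up" in the calls traced here
        local id=$2

        status=1
        for ((i=0; i < 300; i++)); do
            echo $i               # the bare 0/1/2/3 lines on stdout above
            if ! ceph osd dump | grep "osd.$id $state"; then
                sleep 1
            else
                status=0
                break
            fi
        done
        return $status
    }

The success case is visible in the log: on the fourth attempt grep 'osd.0 up' matches the osdmap line for osd.0, status flips to 0, and the loop breaks and returns 0.
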
2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:171: TEST_mon_classes: run_osd td/crush-classes 1 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-classes 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/1 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/1' 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/1/journal' 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:27.697 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:27.698 
INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:56:27.698 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/1 2026-03-08T22:56:27.699 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:56:27.700 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=9413af4f-f79e-4862-8683-7ab06c1f0f2d 2026-03-08T22:56:27.700 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 9413af4f-f79e-4862-8683-7ab06c1f0f2d' 2026-03-08T22:56:27.700 
INFO:tasks.workunit.client.0.vm10.stdout:add osd1 9413af4f-f79e-4862-8683-7ab06c1f0f2d 2026-03-08T22:56:27.700 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:56:27.714 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCb/q1pKtWPKhAAwWmH+YRXDY6sdFyGIRyIDg== 2026-03-08T22:56:27.715 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCb/q1pKtWPKhAAwWmH+YRXDY6sdFyGIRyIDg=="}' 2026-03-08T22:56:27.715 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 9413af4f-f79e-4862-8683-7ab06c1f0f2d -i td/crush-classes/1/new.json 2026-03-08T22:56:27.847 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:56:27.856 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-classes/1/new.json 2026-03-08T22:56:27.857 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCb/q1pKtWPKhAAwWmH+YRXDY6sdFyGIRyIDg== --osd-uuid 9413af4f-f79e-4862-8683-7ab06c1f0f2d 2026-03-08T22:56:27.878 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:27.878+0000 7feadc1f9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:27.881 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:27.881+0000 7feadc1f9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:27.882 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:27.882+0000 7feadc1f9780 -1 WARNING: all dangerous and experimental features are enabled. 
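
Each OSD is brought up by two ceph-osd invocations, both visible verbatim in the trace: one with --mkfs to format the BlueStore data directory using the freshly generated key and uuid, then a plain start against the formatted store. The triple "WARNING: all dangerous and experimental features are enabled" after each invocation is the expected side effect of the --enable-experimental-unrecoverable-data-corrupting-features=* option the helper passes to every daemon, and the bdev/bluestore complaints emitted as mkfs probes the not-yet-initialized data directory appear benign in this run (the test proceeds past them each time). Condensed sketch, with $ceph_args standing for the long option string run_osd assembled above:

    # Condensed from the two ceph-osd command lines in the trace.
    uuid=$(uuidgen)
    OSD_SECRET=$(ceph-authtool --gen-print-key)

    # Phase 1: format the OSD's data directory.
    ceph-osd -i $id $ceph_args --mkfs --key $OSD_SECRET --osd-uuid $uuid

    # Phase 2: start the daemon against the freshly formatted store.
    ceph-osd -i $id $ceph_args
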
2026-03-08T22:56:27.883 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:27.883+0000 7feadc1f9780 -1 bdev(0x55b374041c00 td/crush-classes/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:56:27.883 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:27.883+0000 7feadc1f9780 -1 bluestore(td/crush-classes/1) _read_fsid unparsable uuid 2026-03-08T22:56:30.445 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/1/keyring 2026-03-08T22:56:30.445 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:56:30.446 INFO:tasks.workunit.client.0.vm10.stdout:adding osd1 key to auth repository 2026-03-08T22:56:30.446 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:56:30.446 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:56:30.571 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:56:30.571 INFO:tasks.workunit.client.0.vm10.stdout:start osd.1 2026-03-08T22:56:30.571 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:30.571 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:56:30.572 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:56:30.574 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:56:30.591 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:30.591+0000 7fcba9ad2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:30.599 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:30.600+0000 7fcba9ad2780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:30.600 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:30.601+0000 7fcba9ad2780 -1 WARNING: all dangerous and experimental features are enabled. 
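
Before waiting for the OSD to be marked up, run_osd checks whether the cluster-wide noup flag is set (ceph-helpers.sh:681), since the wait could never succeed with noup in force. Only the three-stage pipeline is verbatim from the trace; wiring it to the wait as a guard is an assumption about the surrounding helper code:

    # Pipeline traced at ceph-helpers.sh:681; the if/negation framing is assumed.
    if ! ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"'; then
        wait_for_osd up $id || return 1    # ceph-helpers.sh:684 in the trace
    fi
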
2026-03-08T22:56:30.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:56:30.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:56:30.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:56:30.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:56:30.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:56:30.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:30.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:56:30.697 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:56:30.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:30.697 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:30.832 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:31.429 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:31.429+0000 7fcba9ad2780 -1 Falling back to public interface 2026-03-08T22:56:31.833 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:31.833 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:31.833 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:56:31.833 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:56:31.834 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:31.834 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:31.943 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:32.553 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:32.553+0000 7fcba9ad2780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:56:32.944 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:56:32.944 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:32.944 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:32.944 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:56:32.945 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:32.945 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:33.256 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:34.258 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:56:34.258 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:34.258 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:34.259 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:56:34.259 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:34.259 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:34.397 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:35.399 INFO:tasks.workunit.client.0.vm10.stdout:4 2026-03-08T22:56:35.399 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:35.399 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:35.399 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:56:35.399 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:35.399 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:35.508 INFO:tasks.workunit.client.0.vm10.stdout:osd.1 up in weight 1 up_from 8 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6808/2443304427,v1:127.0.0.1:6809/2443304427] [v2:127.0.0.1:6810/2443304427,v1:127.0.0.1:6811/2443304427] exists,up 9413af4f-f79e-4862-8683-7ab06c1f0f2d 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:172: TEST_mon_classes: run_osd td/crush-classes 2 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: 
run_osd: local dir=td/crush-classes 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/2 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/2' 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/2/journal' 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:35.509 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:35.510 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:35.510 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:35.510 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:56:35.510 
INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:56:35.510 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:56:35.510 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:56:35.510 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:56:35.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:56:35.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-08T22:56:35.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-08T22:56:35.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:56:35.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:56:35.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:56:35.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:56:35.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:56:35.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:56:35.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/2 2026-03-08T22:56:35.512 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:56:35.513 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=818a2d78-2b7b-49e4-9320-96c89a4d5378 2026-03-08T22:56:35.513 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 818a2d78-2b7b-49e4-9320-96c89a4d5378' 2026-03-08T22:56:35.513 INFO:tasks.workunit.client.0.vm10.stdout:add osd2 818a2d78-2b7b-49e4-9320-96c89a4d5378 2026-03-08T22:56:35.513 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:56:35.528 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCj/q1pPvRyHxAAVKluEhgyXMP+4RbR7ETwDw== 2026-03-08T22:56:35.528 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCj/q1pPvRyHxAAVKluEhgyXMP+4RbR7ETwDw=="}' 2026-03-08T22:56:35.528 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 818a2d78-2b7b-49e4-9320-96c89a4d5378 -i td/crush-classes/2/new.json 2026-03-08T22:56:35.662 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:56:35.671 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-classes/2/new.json 2026-03-08T22:56:35.672 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCj/q1pPvRyHxAAVKluEhgyXMP+4RbR7ETwDw== --osd-uuid 818a2d78-2b7b-49e4-9320-96c89a4d5378 2026-03-08T22:56:35.696 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:35.696+0000 7f8b73c10780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:35.698 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:35.698+0000 7f8b73c10780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:35.699 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:35.700+0000 7f8b73c10780 -1 WARNING: all dangerous and experimental features are enabled. 
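
The admin-socket path that recurs in every ceph_args assembly is produced by the get_asok_dir/get_asok_path helpers traced above (ceph-helpers.sh:108-120). With no name argument and no override set, they fall back to a per-process directory under /tmp and emit literal $cluster-$name metavariables, which each daemon expands itself when it creates its socket. A sketch reconstructed from the trace; the CEPH_ASOK_DIR override name and the non-empty-name branch are assumptions, since the trace only exercises the empty-name path:

    function get_asok_dir() {
        if [ -n "$CEPH_ASOK_DIR" ]; then   # the [ -n '' ] test in the trace
            echo "$CEPH_ASOK_DIR"
        else
            echo /tmp/ceph-asok.$$         # e.g. /tmp/ceph-asok.66179 here
        fi
    }

    function get_asok_path() {
        local name=$1
        if [ -n "$name" ]; then
            echo "$(get_asok_dir)/ceph-$name.asok"   # assumed named form
        else
            # No name: leave $cluster-$name for the daemon to expand.
            echo "$(get_asok_dir)/\$cluster-\$name.asok"
        fi
    }
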
2026-03-08T22:56:35.700 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:35.700+0000 7f8b73c10780 -1 bdev(0x55672220bc00 td/crush-classes/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:56:35.700 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:35.700+0000 7f8b73c10780 -1 bluestore(td/crush-classes/2) _read_fsid unparsable uuid 2026-03-08T22:56:38.135 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/2/keyring 2026-03-08T22:56:38.135 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:56:38.139 INFO:tasks.workunit.client.0.vm10.stdout:adding osd2 key to auth repository 2026-03-08T22:56:38.139 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:56:38.139 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:56:38.265 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:56:38.265 INFO:tasks.workunit.client.0.vm10.stdout:start osd.2 2026-03-08T22:56:38.265 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:38.266 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:56:38.267 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:56:38.268 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:56:38.285 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:38.285+0000 7fc5c9584780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:38.314 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:38.315+0000 7fc5c9584780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:38.315 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:38.316+0000 7fc5c9584780 -1 WARNING: all dangerous and experimental features are enabled. 
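
Around the mkfs step, run_osd registers each OSD with the monitors in two moves that the trace repeats for osd.0/1/2: ceph osd new allocates the id against the generated uuid, reading the cephx secret from a throwaway JSON file (the bare 0/1/2 on stdout is the allocated id), and ceph auth add then installs the keyring with the standard OSD capability profile. Condensed sketch; the redirect into new.json and the $osd_data shorthand are presentational:

    # Condensed from ceph-helpers.sh:662-676 as traced.
    echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > $osd_data/new.json
    id=$(ceph osd new $uuid -i $osd_data/new.json)   # prints the allocated id
    rm $osd_data/new.json

    # ... ceph-osd --mkfs runs here; the helper then writes $osd_data/keyring
    # (the bare 'cat' at ceph-helpers.sh:671) ...

    ceph -i $osd_data/keyring auth add osd.$id \
        osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
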
2026-03-08T22:56:38.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:56:38.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:56:38.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:56:38.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:56:38.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:56:38.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:38.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:56:38.385 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:56:38.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:38.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:56:38.508 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:39.418 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:39.419+0000 7fc5c9584780 -1 Falling back to public interface 2026-03-08T22:56:39.510 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:39.510 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:39.510 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:56:39.510 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:56:39.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:39.511 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:56:39.621 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:40.333 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:40.333+0000 7fc5c9584780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:56:40.622 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:40.622 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:40.622 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:56:40.622 
INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:56:40.624 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:40.624 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:56:40.751 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:41.752 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:56:41.752 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:41.752 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:41.752 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:56:41.753 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:41.753 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:56:41.862 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:42.506 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:42.506+0000 7fc5c5526640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T22:56:42.863 INFO:tasks.workunit.client.0.vm10.stdout:4 2026-03-08T22:56:42.863 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:42.863 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:42.863 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:56:42.864 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:56:42.864 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:42.977 INFO:tasks.workunit.client.0.vm10.stdout:osd.2 up in weight 1 up_from 12 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6816/4231763640,v1:127.0.0.1:6817/4231763640] [v2:127.0.0.1:6818/4231763640,v1:127.0.0.1:6819/4231763640] exists,up 818a2d78-2b7b-49e4-9320-96c89a4d5378 2026-03-08T22:56:42.977 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:56:42.977 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:56:42.977 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:56:42.977 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:173: TEST_mon_classes: 
create_rbd_pool 2026-03-08T22:56:42.977 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T22:56:43.086 INFO:tasks.workunit.client.0.vm10.stderr:pool 'rbd' does not exist 2026-03-08T22:56:43.093 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T22:56:43.093 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:56:43.228 INFO:tasks.workunit.client.0.vm10.stderr:pool 'rbd' already exists 2026-03-08T22:56:43.239 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:56:44.241 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T22:56:44.621 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:175: TEST_mon_classes: get_osds_up rbd SOMETHING 2026-03-08T22:56:44.622 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd 2026-03-08T22:56:44.622 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING 2026-03-08T22:56:44.622 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING 2026-03-08T22:56:44.622 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . 
-o ' ' 2026-03-08T22:56:44.739 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=1 2 0 ' 2026-03-08T22:56:44.739 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 1 2 0 2026-03-08T22:56:44.739 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:175: TEST_mon_classes: test '1 2 0' == '1 2 0' 2026-03-08T22:56:44.739 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:176: TEST_mon_classes: add_something td/crush-classes SOMETHING 2026-03-08T22:56:44.739 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:42: add_something: local dir=td/crush-classes 2026-03-08T22:56:44.739 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:43: add_something: local obj=SOMETHING 2026-03-08T22:56:44.739 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:45: add_something: local payload=ABCDEF 2026-03-08T22:56:44.739 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:46: add_something: echo ABCDEF 2026-03-08T22:56:44.739 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:47: add_something: rados --pool rbd put SOMETHING td/crush-classes/ORIGINAL 2026-03-08T22:56:44.765 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:179: TEST_mon_classes: ceph osd crush class create CLASS 2026-03-08T22:56:45.006 INFO:tasks.workunit.client.0.vm10.stderr:class 'CLASS' already exists 2026-03-08T22:56:45.013 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:180: TEST_mon_classes: ceph osd crush class create CLASS 2026-03-08T22:56:45.122 INFO:tasks.workunit.client.0.vm10.stderr:class 'CLASS' already exists 2026-03-08T22:56:45.130 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:181: TEST_mon_classes: ceph osd crush class ls 2026-03-08T22:56:45.130 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:181: TEST_mon_classes: grep CLASS 2026-03-08T22:56:45.245 INFO:tasks.workunit.client.0.vm10.stdout: "CLASS" 2026-03-08T22:56:45.245 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:182: TEST_mon_classes: ceph osd crush class rename CLASS TEMP 2026-03-08T22:56:45.389 INFO:tasks.workunit.client.0.vm10.stderr:already renamed to 'TEMP' 2026-03-08T22:56:45.397 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:183: TEST_mon_classes: ceph osd crush class ls 2026-03-08T22:56:45.397 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:183: TEST_mon_classes: grep TEMP 2026-03-08T22:56:45.588 INFO:tasks.workunit.client.0.vm10.stdout: "TEMP" 2026-03-08T22:56:45.589 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:184: 
TEST_mon_classes: ceph osd crush class rename TEMP CLASS 2026-03-08T22:56:45.787 INFO:tasks.workunit.client.0.vm10.stderr:already renamed to 'CLASS' 2026-03-08T22:56:45.796 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:185: TEST_mon_classes: ceph osd crush class ls 2026-03-08T22:56:45.796 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:185: TEST_mon_classes: grep CLASS 2026-03-08T22:56:45.908 INFO:tasks.workunit.client.0.vm10.stdout: "CLASS" 2026-03-08T22:56:45.912 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:186: TEST_mon_classes: ceph osd erasure-code-profile set myprofile plugin=jerasure technique=reed_sol_van k=2 m=1 crush-failure-domain=osd crush-device-class=CLASS 2026-03-08T22:56:46.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:187: TEST_mon_classes: expect_failure td/crush-classes EBUSY ceph osd crush class rm CLASS 2026-03-08T22:56:46.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2025: expect_failure: local dir=td/crush-classes 2026-03-08T22:56:46.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2026: expect_failure: shift 2026-03-08T22:56:46.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2027: expect_failure: local expected=EBUSY 2026-03-08T22:56:46.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2028: expect_failure: shift 2026-03-08T22:56:46.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2029: expect_failure: local success 2026-03-08T22:56:46.111 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2031: expect_failure: ceph osd crush class rm CLASS 2026-03-08T22:56:46.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2034: expect_failure: success=false 2026-03-08T22:56:46.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2037: expect_failure: false 2026-03-08T22:56:46.202 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2037: expect_failure: grep --quiet EBUSY td/crush-classes/out 2026-03-08T22:56:46.203 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2041: expect_failure: return 0 2026-03-08T22:56:46.203 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:188: TEST_mon_classes: ceph osd erasure-code-profile rm myprofile 2026-03-08T22:56:46.554 INFO:tasks.workunit.client.0.vm10.stderr:erasure-code-profile myprofile does not exist 2026-03-08T22:56:46.563 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:189: TEST_mon_classes: ceph osd crush class rm CLASS 2026-03-08T22:56:46.776 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:190: TEST_mon_classes: ceph osd crush class rm CLASS 2026-03-08T22:56:46.899 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:193: TEST_mon_classes: ceph osd crush set-device-class aaa osd.0 2026-03-08T22:56:47.307 INFO:tasks.workunit.client.0.vm10.stderr:osd.0 already set to class aaa. set-device-class item id 0 name 'osd.0' device_class 'aaa': no change. set osd(s) to class 'aaa' 2026-03-08T22:56:47.315 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:194: TEST_mon_classes: ceph osd tree 2026-03-08T22:56:47.360 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:194: TEST_mon_classes: grep -q aaa 2026-03-08T22:56:47.485 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:195: TEST_mon_classes: ceph osd crush dump 2026-03-08T22:56:47.485 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:195: TEST_mon_classes: grep -q '~aaa' 2026-03-08T22:56:47.728 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:196: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-08T22:56:47.728 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:196: TEST_mon_classes: grep -q '~aaa' 2026-03-08T22:56:47.861 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:197: TEST_mon_classes: ceph osd crush set-device-class bbb osd.1 2026-03-08T22:56:48.072 INFO:tasks.workunit.client.0.vm10.stderr:osd.1 already set to class bbb. set-device-class item id 1 name 'osd.1' device_class 'bbb': no change. set osd(s) to class 'bbb' 2026-03-08T22:56:48.084 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:198: TEST_mon_classes: ceph osd tree 2026-03-08T22:56:48.084 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:198: TEST_mon_classes: grep -q bbb 2026-03-08T22:56:48.195 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:199: TEST_mon_classes: ceph osd crush dump 2026-03-08T22:56:48.196 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:199: TEST_mon_classes: grep -q '~bbb' 2026-03-08T22:56:48.312 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:200: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-08T22:56:48.313 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:200: TEST_mon_classes: grep -q '~bbb' 2026-03-08T22:56:48.433 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:201: TEST_mon_classes: ceph osd crush set-device-class ccc osd.2 2026-03-08T22:56:48.688 INFO:tasks.workunit.client.0.vm10.stderr:osd.2 already set to class ccc. set-device-class item id 2 name 'osd.2' device_class 'ccc': no change. 
set osd(s) to class 'ccc' 2026-03-08T22:56:48.699 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:202: TEST_mon_classes: ceph osd tree 2026-03-08T22:56:48.699 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:202: TEST_mon_classes: grep -q ccc 2026-03-08T22:56:48.812 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:203: TEST_mon_classes: ceph osd crush dump 2026-03-08T22:56:48.812 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:203: TEST_mon_classes: grep -q '~ccc' 2026-03-08T22:56:48.923 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:204: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-08T22:56:48.923 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:204: TEST_mon_classes: grep -q '~ccc' 2026-03-08T22:56:49.041 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:205: TEST_mon_classes: ceph osd crush rm-device-class 0 2026-03-08T22:56:49.299 INFO:tasks.workunit.client.0.vm10.stderr:osd.0 belongs to no class, done removing class of osd(s): 2026-03-08T22:56:49.308 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:206: TEST_mon_classes: ceph osd tree 2026-03-08T22:56:49.308 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:206: TEST_mon_classes: grep -q aaa 2026-03-08T22:56:49.426 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:207: TEST_mon_classes: ceph osd crush class ls 2026-03-08T22:56:49.427 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:207: TEST_mon_classes: grep -q aaa 2026-03-08T22:56:49.536 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:208: TEST_mon_classes: ceph osd crush rm-device-class 1 2026-03-08T22:56:49.809 INFO:tasks.workunit.client.0.vm10.stderr:osd.1 belongs to no class, done removing class of osd(s): 2026-03-08T22:56:49.819 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:209: TEST_mon_classes: ceph osd tree 2026-03-08T22:56:49.819 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:209: TEST_mon_classes: grep -q bbb 2026-03-08T22:56:49.927 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:210: TEST_mon_classes: ceph osd crush class ls 2026-03-08T22:56:49.927 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:210: TEST_mon_classes: grep -q bbb 2026-03-08T22:56:50.044 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:211: TEST_mon_classes: ceph osd crush rm-device-class 2 2026-03-08T22:56:50.316 INFO:tasks.workunit.client.0.vm10.stderr:osd.2 belongs to no class, done removing class of osd(s): 2026-03-08T22:56:50.327 
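
Script lines 193-213 exercise the basic per-OSD class lifecycle: assign, verify in both the tree and the shadow hierarchy, then remove (a bare numeric id is accepted in place of osd.N). Condensed below; the negated polarity of the post-removal greps is an assumption from the test's intent, since xtrace does not show it:

    ceph osd crush set-device-class aaa osd.0
    ceph osd tree | grep -q aaa                  # visible in the tree
    ceph osd crush dump | grep -q '~aaa'         # and as a shadow bucket
    ceph osd crush rm-device-class 0             # bare numeric id is accepted
    ! ceph osd tree | grep -q aaa                # gone from the tree
    ! ceph osd crush class ls | grep -q aaa      # and from the class list
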
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:212: TEST_mon_classes: ceph osd tree 2026-03-08T22:56:50.327 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:212: TEST_mon_classes: grep -q ccc 2026-03-08T22:56:50.440 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:213: TEST_mon_classes: ceph osd crush class ls 2026-03-08T22:56:50.440 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:213: TEST_mon_classes: grep -q ccc 2026-03-08T22:56:50.552 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:214: TEST_mon_classes: ceph osd crush set-device-class asdf all 2026-03-08T22:56:50.827 INFO:tasks.workunit.client.0.vm10.stderr:osd.0 already set to class asdf. osd.1 already set to class asdf. osd.2 already set to class asdf. set osd(s) to class 'asdf' 2026-03-08T22:56:50.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:215: TEST_mon_classes: ceph osd tree 2026-03-08T22:56:50.838 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:215: TEST_mon_classes: grep -q asdf 2026-03-08T22:56:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:216: TEST_mon_classes: ceph osd crush dump 2026-03-08T22:56:50.956 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:216: TEST_mon_classes: grep -q '~asdf' 2026-03-08T22:56:51.070 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:217: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-08T22:56:51.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:217: TEST_mon_classes: grep -q '~asdf' 2026-03-08T22:56:51.196 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:218: TEST_mon_classes: ceph osd crush rule create-replicated asdf-rule default host asdf 2026-03-08T22:56:51.358 INFO:tasks.workunit.client.0.vm10.stderr:rule asdf-rule already exists 2026-03-08T22:56:51.366 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:219: TEST_mon_classes: ceph osd crush rm-device-class all 2026-03-08T22:56:51.663 INFO:tasks.workunit.client.0.vm10.stderr:osd.0 belongs to no class, osd.1 belongs to no class, osd.2 belongs to no class, done removing class of osd(s): 2026-03-08T22:56:51.674 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:220: TEST_mon_classes: ceph osd tree 2026-03-08T22:56:51.674 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:220: TEST_mon_classes: grep -q asdf 2026-03-08T22:56:51.795 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:221: TEST_mon_classes: ceph osd crush class ls 2026-03-08T22:56:51.795 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:221: 
TEST_mon_classes: grep -q asdf 2026-03-08T22:56:51.911 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:223: TEST_mon_classes: ceph osd crush set-device-class abc osd.2 2026-03-08T22:56:52.175 INFO:tasks.workunit.client.0.vm10.stderr:osd.2 already set to class abc. set-device-class item id 2 name 'osd.2' device_class 'abc': no change. set osd(s) to class 'abc' 2026-03-08T22:56:52.185 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:224: TEST_mon_classes: ceph osd crush move osd.2 root=foo rack=foo-rack host=foo-host 2026-03-08T22:56:52.306 INFO:tasks.workunit.client.0.vm10.stderr:no need to move item id 2 name 'osd.2' to location {host=foo-host,rack=foo-rack,root=foo} in crush map 2026-03-08T22:56:52.314 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:225: TEST_mon_classes: ceph osd tree 2026-03-08T22:56:52.314 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:225: TEST_mon_classes: awk '$1 == 2 && $2 == "abc" {print $0}' 2026-03-08T22:56:52.432 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:225: TEST_mon_classes: out=' 2 abc 0.09769 osd.2 up 1.00000 1.00000' 2026-03-08T22:56:52.433 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:226: TEST_mon_classes: '[' ' 2 abc 0.09769 osd.2 up 1.00000 1.00000' == '' ']' 2026-03-08T22:56:52.433 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:231: TEST_mon_classes: ceph osd crush dump 2026-03-08T22:56:52.433 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:231: TEST_mon_classes: grep -q foo~abc 2026-03-08T22:56:52.552 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:232: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-08T22:56:52.552 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:232: TEST_mon_classes: grep -q foo~abc 2026-03-08T22:56:52.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:233: TEST_mon_classes: ceph osd crush dump 2026-03-08T22:56:52.677 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:233: TEST_mon_classes: grep -q foo-rack~abc 2026-03-08T22:56:52.800 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:234: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-08T22:56:52.800 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:234: TEST_mon_classes: grep -q foo-rack~abc 2026-03-08T22:56:52.918 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:235: TEST_mon_classes: ceph osd crush dump 2026-03-08T22:56:52.918 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:235: TEST_mon_classes: grep -q foo-host~abc 2026-03-08T22:56:53.034 
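
Moving a classed OSD into a new root/rack/host (script lines 223-236) creates a matching shadow bucket, named <bucket>~<class>, for every ancestor of the new location. The checks as traced:

    ceph osd crush set-device-class abc osd.2
    ceph osd crush move osd.2 root=foo rack=foo-rack host=foo-host
    # every ancestor of the moved OSD gains a per-class shadow bucket:
    ceph osd crush tree --show-shadow | grep -q 'foo~abc'
    ceph osd crush tree --show-shadow | grep -q 'foo-rack~abc'
    ceph osd crush tree --show-shadow | grep -q 'foo-host~abc'
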
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:236: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-08T22:56:53.034 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:236: TEST_mon_classes: grep -q foo-host~abc 2026-03-08T22:56:53.163 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:237: TEST_mon_classes: ceph osd crush rm-device-class osd.2 2026-03-08T22:56:53.471 INFO:tasks.workunit.client.0.vm10.stderr:osd.2 belongs to no class, done removing class of osd(s): 2026-03-08T22:56:53.481 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:239: TEST_mon_classes: ceph osd crush set-device-class abc osd.2 2026-03-08T22:56:53.820 INFO:tasks.workunit.client.0.vm10.stderr:osd.2 already set to class abc. set-device-class item id 2 name 'osd.2' device_class 'abc': no change. set osd(s) to class 'abc' 2026-03-08T22:56:53.830 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:241: TEST_mon_classes: ceph osd crush rule create-replicated foo-rule foo host abc 2026-03-08T22:56:54.008 INFO:tasks.workunit.client.0.vm10.stderr:rule foo-rule already exists 2026-03-08T22:56:54.017 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:244: TEST_mon_classes: ceph osd crush set-device-class hdd osd.0 2026-03-08T22:56:54.303 INFO:tasks.workunit.client.0.vm10.stderr:osd.0 already set to class hdd. set-device-class item id 0 name 'osd.0' device_class 'hdd': no change. set osd(s) to class 'hdd' 2026-03-08T22:56:54.315 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:245: TEST_mon_classes: expect_failure td/crush-classes EBUSY ceph osd crush set-device-class nvme osd.0 2026-03-08T22:56:54.315 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2025: expect_failure: local dir=td/crush-classes 2026-03-08T22:56:54.315 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2026: expect_failure: shift 2026-03-08T22:56:54.315 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2027: expect_failure: local expected=EBUSY 2026-03-08T22:56:54.315 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2028: expect_failure: shift 2026-03-08T22:56:54.315 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2029: expect_failure: local success 2026-03-08T22:56:54.315 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2031: expect_failure: ceph osd crush set-device-class nvme osd.0 2026-03-08T22:56:54.408 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2034: expect_failure: success=false 2026-03-08T22:56:54.408 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2037: expect_failure: false 2026-03-08T22:56:54.408 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2037: expect_failure: 
grep --quiet EBUSY td/crush-classes/out 2026-03-08T22:56:54.409 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2041: expect_failure: return 0 2026-03-08T22:56:54.409 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:248: TEST_mon_classes: ceph osd crush rm-device-class all 2026-03-08T22:56:54.616 INFO:tasks.workunit.client.0.vm10.stderr:osd.0 belongs to no class, osd.1 belongs to no class, osd.2 belongs to no class, done removing class of osd(s): 2026-03-08T22:56:54.627 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:249: TEST_mon_classes: ceph osd crush set-device-class class_1 all 2026-03-08T22:56:54.824 INFO:tasks.workunit.client.0.vm10.stderr:osd.0 already set to class class_1. osd.1 already set to class class_1. osd.2 already set to class class_1. set osd(s) to class 'class_1' 2026-03-08T22:56:54.835 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:250: TEST_mon_classes: ceph osd crush class ls 2026-03-08T22:56:54.835 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:250: TEST_mon_classes: grep class_1 2026-03-08T22:56:54.950 INFO:tasks.workunit.client.0.vm10.stdout: "class_1" 2026-03-08T22:56:54.950 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:251: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-08T22:56:54.950 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:251: TEST_mon_classes: grep class_1 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stdout:-20 class_1 0.19537 root default~class_1 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stdout:-19 class_1 0.19537 host vm10~class_1 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stdout: 0 class_1 0.09769 osd.0 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stdout: 1 class_1 0.09769 osd.1 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stdout:-18 class_1 0.09769 root foo~class_1 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stdout:-17 class_1 0.09769 rack foo-rack~class_1 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stdout:-16 class_1 0.09769 host foo-host~class_1 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stdout: 2 class_1 0.09769 osd.2 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stdout: 2 class_1 0.09769 osd.2 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stdout: 0 class_1 0.09769 osd.0 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stdout: 1 class_1 0.09769 osd.1 2026-03-08T22:56:55.061 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:252: TEST_mon_classes: ceph osd crush rule create-replicated class_1_rule default host class_1 2026-03-08T22:56:55.251 INFO:tasks.workunit.client.0.vm10.stderr:rule class_1_rule already exists 2026-03-08T22:56:55.259 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:253: TEST_mon_classes: ceph osd crush class rename class_1 class_2 2026-03-08T22:56:55.460 INFO:tasks.workunit.client.0.vm10.stderr:already renamed to 'class_2' 2026-03-08T22:56:55.469 
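
Two further invariants are traced here: an OSD's class cannot simply be overwritten (set-device-class nvme fails with EBUSY while osd.0 is still hdd, so the old class must be removed first), and a class can be renamed in place, with rules and shadow buckets following the new name. Condensed:

    ceph osd crush set-device-class hdd osd.0
    ! ceph osd crush set-device-class nvme osd.0 > out 2>&1
    grep -q EBUSY out                   # class must be removed, not replaced
    ceph osd crush rm-device-class all
    ceph osd crush set-device-class class_1 all
    ceph osd crush rule create-replicated class_1_rule default host class_1
    ceph osd crush class rename class_1 class_2
    ceph osd crush class ls | grep -q class_2   # rules/shadow buckets follow
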
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:254: TEST_mon_classes: ceph osd crush class rename class_1 class_2 2026-03-08T22:56:55.577 INFO:tasks.workunit.client.0.vm10.stderr:already renamed to 'class_2' 2026-03-08T22:56:55.585 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:255: TEST_mon_classes: ceph osd crush class ls 2026-03-08T22:56:55.585 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:255: TEST_mon_classes: grep class_1 2026-03-08T22:56:55.717 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:256: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-08T22:56:55.717 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:256: TEST_mon_classes: grep class_1 2026-03-08T22:56:55.832 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:257: TEST_mon_classes: ceph osd crush class ls 2026-03-08T22:56:55.832 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:257: TEST_mon_classes: grep class_2 2026-03-08T22:56:55.955 INFO:tasks.workunit.client.0.vm10.stdout: "class_2" 2026-03-08T22:56:55.955 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:258: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-08T22:56:55.955 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:258: TEST_mon_classes: grep class_2 2026-03-08T22:56:56.071 INFO:tasks.workunit.client.0.vm10.stdout:-20 class_2 0.19537 root default~class_2 2026-03-08T22:56:56.071 INFO:tasks.workunit.client.0.vm10.stdout:-19 class_2 0.19537 host vm10~class_2 2026-03-08T22:56:56.071 INFO:tasks.workunit.client.0.vm10.stdout: 0 class_2 0.09769 osd.0 2026-03-08T22:56:56.071 INFO:tasks.workunit.client.0.vm10.stdout: 1 class_2 0.09769 osd.1 2026-03-08T22:56:56.071 INFO:tasks.workunit.client.0.vm10.stdout:-18 class_2 0.09769 root foo~class_2 2026-03-08T22:56:56.071 INFO:tasks.workunit.client.0.vm10.stdout:-17 class_2 0.09769 rack foo-rack~class_2 2026-03-08T22:56:56.071 INFO:tasks.workunit.client.0.vm10.stdout:-16 class_2 0.09769 host foo-host~class_2 2026-03-08T22:56:56.071 INFO:tasks.workunit.client.0.vm10.stdout: 2 class_2 0.09769 osd.2 2026-03-08T22:56:56.071 INFO:tasks.workunit.client.0.vm10.stdout: 2 class_2 0.09769 osd.2 2026-03-08T22:56:56.071 INFO:tasks.workunit.client.0.vm10.stdout: 0 class_2 0.09769 osd.0 2026-03-08T22:56:56.071 INFO:tasks.workunit.client.0.vm10.stdout: 1 class_2 0.09769 osd.1 2026-03-08T22:56:56.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:37: run: teardown td/crush-classes 2026-03-08T22:56:56.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-classes 2026-03-08T22:56:56.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:56:56.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons 
td/crush-classes KILL 2026-03-08T22:56:56.072 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:56:56.072 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:56:56.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:56:56.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:56:56.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:56:56.184 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:56:56.184 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:56:56.185 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:56:56.185 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:56:56.186 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:56:56.186 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:56:56.186 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:56:56.187 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:56.187 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:56:56.187 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:56:56.187 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:56.188 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:56:56.189 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:56:56.189 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-classes 2026-03-08T22:56:56.204 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:56:56.204 
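
Between test functions the harness tears the cluster down completely. The teardown trace above reduces to roughly this shape (helpers are the ones from qa/standalone/ceph-helpers.sh seen in the trace; the btrfs and core-pattern branches are elided):

    teardown() {
        local dir=$1
        kill_daemons "$dir" KILL     # SIGKILL everything started under $dir
        rm -fr "$dir"                # mon/osd stores, logs, test data
        rm -rf "$(get_asok_dir)"     # per-run admin sockets (/tmp/ceph-asok.$$)
    }
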
INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:56.204 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:56:56.204 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.66179 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:34: run: for func in $funcs 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:35: run: setup td/crush-classes 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/crush-classes 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/crush-classes 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-classes 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-classes KILL 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:56:56.205 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:56:56.207 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:56:56.207 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:56:56.207 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:56:56.208 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:56:56.208 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:56:56.208 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:56:56.209 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:56:56.209 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:56.209 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:56:56.210 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:56:56.210 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:56.210 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:56:56.211 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:56:56.211 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-classes 2026-03-08T22:56:56.212 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:56:56.212 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:56.212 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:56:56.212 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.66179 2026-03-08T22:56:56.213 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:56:56.213 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:56:56.213 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/crush-classes 2026-03-08T22:56:56.214 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:56:56.214 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:56.214 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:56:56.214 
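
setup() is deliberately teardown-first, so a test never inherits state from a previous run; a sketch from the trace here and in the lines that follow:

    setup() {
        local dir=$1
        teardown "$dir"                        # idempotent: clear prior state
        mkdir -p "$dir" "$(get_asok_dir)"
        # the full helper also verifies 'ulimit -n' is above 1024
        trap "teardown $dir 1" TERM HUP INT    # the 1 asks for log dumps
    }
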
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.66179 2026-03-08T22:56:56.215 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/crush-classes 1' TERM HUP INT 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:36: run: TEST_reweight_vs_classes td/crush-classes 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:61: TEST_reweight_vs_classes: local dir=td/crush-classes 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:66: TEST_reweight_vs_classes: run_mon td/crush-classes a 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/crush-classes 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/crush-classes/a 2026-03-08T22:56:56.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-classes/a --run-dir=td/crush-classes 2026-03-08T22:56:56.239 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:56:56.239 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:56.239 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:56.239 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:56.239 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:56.239 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:56:56.239 
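
run_mon boils down to a --mkfs pass followed by launching ceph-mon with the test overrides visible in the trace. Abridged to a subset of the traced flags; the single-quoted $name metavariables are expanded by ceph itself, not by the shell:

    run_mon() {
        local dir=$1 id=$2
        ceph-mon --id "$id" --mkfs --mon-data="$dir/$id" --run-dir="$dir"
        ceph-mon --id "$id" \
            --mon-data="$dir/$id" --run-dir="$dir" \
            --log-file="$dir"'/$name.log' \
            --pid-file="$dir"'/$name.pid' \
            --admin-socket="$(get_asok_path)" \
            --mon-allow-pool-delete --paxos-propose-interval=0.1
    }
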
INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:56:56.240 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-classes/a '--log-file=td/crush-classes/$name.log' '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --mon-cluster-log-file=td/crush-classes/log --run-dir=td/crush-classes '--pid-file=td/crush-classes/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:56:56.272 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:56:56.272 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:56:56.273 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:56:56.273 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:56:56.273 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:56:56.273 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:56:56.273 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:56:56.273 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:56:56.273 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:56:56.275 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:56:56.275 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:56.275 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:56:56.275 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.66179/ceph-mon.a.asok 2026-03-08T22:56:56.276 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:56:56.276 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon 
/tmp/ceph-asok.66179/ceph-mon.a.asok config get fsid 2026-03-08T22:56:56.329 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:56:56.329 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:56:56.329 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:56:56.329 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:56:56.329 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:56:56.329 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:56:56.329 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:56:56.329 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:56:56.329 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:56:56.329 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:56.329 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:56:56.330 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.66179/ceph-mon.a.asok 2026-03-08T22:56:56.330 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:56:56.330 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.66179/ceph-mon.a.asok config get mon_host 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:67: TEST_reweight_vs_classes: run_osd td/crush-classes 0 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-classes 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/0 2026-03-08T22:56:56.384 
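
The fsid and mon_host reads above go through the daemon's admin socket rather than through the cluster, which works before any quorum-dependent path does. The traced get_config helper, condensed:

    get_config() {
        local daemon=$1 id=$2 config=$3
        # asok query; CEPH_ARGS is cleared so test-wide defaults don't leak in
        CEPH_ARGS='' ceph --format json daemon \
            "$(get_asok_path "$daemon.$id")" config get "$config" \
            | jq -r ".$config"
    }

    fsid=$(get_config mon a fsid)
    mon_host=$(get_config mon a mon_host)
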
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/0' 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/0/journal' 2026-03-08T22:56:56.384 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:56:56.385 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:56:56.385 
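
One flag in the assembled ceph_args matters for this whole test file: --osd-class-update-on-start=false. By default an OSD re-detects and re-sets its own device class at boot, which would undo the manual aaa/bbb/CLASS assignments made above. As a conf-file equivalent (same option in ini spelling; placement in [osd] is the usual convention):

    [osd]
    osd class update on start = false    # keep manually assigned classes
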
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:56:56.386 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-08T22:56:56.386 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-08T22:56:56.386 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:56:56.386 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:56:56.386 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:56:56.386 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:56:56.386 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:56:56.386 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:56:56.386 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/0 2026-03-08T22:56:56.386 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:56:56.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=62ae9a6f-4bc0-42e2-b2a4-e07363df4502 2026-03-08T22:56:56.387 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 62ae9a6f-4bc0-42e2-b2a4-e07363df4502' 2026-03-08T22:56:56.387 INFO:tasks.workunit.client.0.vm10.stdout:add osd0 62ae9a6f-4bc0-42e2-b2a4-e07363df4502 2026-03-08T22:56:56.387 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:56:56.401 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQC4/q1p/cfgFxAAChnEaVqg0YJ3yAc77oxoAA== 2026-03-08T22:56:56.401 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQC4/q1p/cfgFxAAChnEaVqg0YJ3yAc77oxoAA=="}' 2026-03-08T22:56:56.401 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 62ae9a6f-4bc0-42e2-b2a4-e07363df4502 -i td/crush-classes/0/new.json 2026-03-08T22:56:56.518 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:56:56.528 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm 
td/crush-classes/0/new.json 2026-03-08T22:56:56.529 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQC4/q1p/cfgFxAAChnEaVqg0YJ3yAc77oxoAA== --osd-uuid 62ae9a6f-4bc0-42e2-b2a4-e07363df4502 2026-03-08T22:56:56.549 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:56.550+0000 7f19e981e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:56.552 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:56.552+0000 7f19e981e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:56.553 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:56.553+0000 7f19e981e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:56.553 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:56.553+0000 7f19e981e780 -1 bdev(0x55e7a5b19c00 td/crush-classes/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:56:56.553 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:56.554+0000 7f19e981e780 -1 bluestore(td/crush-classes/0) _read_fsid unparsable uuid 2026-03-08T22:56:58.694 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/0/keyring 2026-03-08T22:56:58.694 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:56:58.695 INFO:tasks.workunit.client.0.vm10.stdout:adding osd0 key to auth repository 2026-03-08T22:56:58.695 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:56:58.695 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:56:58.809 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:56:58.810 INFO:tasks.workunit.client.0.vm10.stdout:start osd.0 2026-03-08T22:56:58.810 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 
'--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:58.810 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:56:58.811 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:56:58.812 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:56:58.828 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:58.828+0000 7fd2e1622780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:58.838 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:58.838+0000 7fd2e1622780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:58.840 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:58.839+0000 7fd2e1622780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:58.927 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:56:58.927 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:56:58.927 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:56:58.927 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:56:58.927 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:56:58.927 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:56:58.927 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:58.927 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:56:58.927 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:58.928 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:56:59.039 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:59.914 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:56:59.914+0000 7fd2e1622780 -1 Falling back to public interface 2026-03-08T22:57:00.041 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:00.041 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:00.041 
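
The run_osd trace (helper lines 660-678) is a manual OSD bootstrap: reserve an id against a fresh uuid, mkfs with a generated cephx key, register the key, then start the daemon. Stripped of the argument plumbing ($ceph_args stands for the flag string built above and is intentionally left unquoted, as in the helper; the keyring file is written from $OSD_SECRET in the full helper):

    uuid=$(uuidgen)
    OSD_SECRET=$(ceph-authtool --gen-print-key)
    echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > "$osd_data/new.json"
    id=$(ceph osd new "$uuid" -i "$osd_data/new.json")   # prints the new id
    rm "$osd_data/new.json"
    ceph-osd -i "$id" $ceph_args --mkfs --key "$OSD_SECRET" --osd-uuid "$uuid"
    ceph -i "$osd_data/keyring" auth add "osd.$id" \
        osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
    ceph-osd -i "$id" $ceph_args    # start the daemon proper
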
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:00.041 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:57:00.042 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:00.042 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:57:00.150 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:01.030 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:01.030+0000 7fd2e1622780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:57:01.153 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:57:01.153 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:01.153 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:01.153 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:01.153 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:01.153 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:57:01.275 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:02.084 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:02.084+0000 7fd2dcff9640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:57:02.277 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:57:02.278 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:02.278 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:02.278 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:02.278 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:02.278 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:57:02.389 INFO:tasks.workunit.client.0.vm10.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/1406963288,v1:127.0.0.1:6801/1406963288] [v2:127.0.0.1:6802/1406963288,v1:127.0.0.1:6803/1406963288] exists,up 62ae9a6f-4bc0-42e2-b2a4-e07363df4502 2026-03-08T22:57:02.389 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:02.389 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:02.390 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:68: TEST_reweight_vs_classes: run_osd td/crush-classes 1 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-classes 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/1 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/1' 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/1/journal' 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:57:02.390 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: 
get_asok_dir 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:57:02.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/1 2026-03-08T22:57:02.393 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:57:02.393 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=fc2c92d4-ebae-473b-9859-4e51d277cc49 2026-03-08T22:57:02.393 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 fc2c92d4-ebae-473b-9859-4e51d277cc49' 
2026-03-08T22:57:02.393 INFO:tasks.workunit.client.0.vm10.stdout:add osd1 fc2c92d4-ebae-473b-9859-4e51d277cc49 2026-03-08T22:57:02.394 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:57:02.408 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQC+/q1pKSZNGBAAInRzHzmOK+DVqdrFJhuHVw== 2026-03-08T22:57:02.408 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQC+/q1pKSZNGBAAInRzHzmOK+DVqdrFJhuHVw=="}' 2026-03-08T22:57:02.408 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new fc2c92d4-ebae-473b-9859-4e51d277cc49 -i td/crush-classes/1/new.json 2026-03-08T22:57:02.530 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:57:02.538 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-classes/1/new.json 2026-03-08T22:57:02.539 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQC+/q1pKSZNGBAAInRzHzmOK+DVqdrFJhuHVw== --osd-uuid fc2c92d4-ebae-473b-9859-4e51d277cc49 2026-03-08T22:57:02.560 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:02.560+0000 7fb5fb21a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:02.562 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:02.562+0000 7fb5fb21a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:02.563 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:02.563+0000 7fb5fb21a780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:57:02.563 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:02.564+0000 7fb5fb21a780 -1 bdev(0x55acaaef1c00 td/crush-classes/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:57:02.563 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:02.564+0000 7fb5fb21a780 -1 bluestore(td/crush-classes/1) _read_fsid unparsable uuid 2026-03-08T22:57:05.463 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/1/keyring 2026-03-08T22:57:05.463 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:57:05.464 INFO:tasks.workunit.client.0.vm10.stdout:adding osd1 key to auth repository 2026-03-08T22:57:05.464 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:57:05.464 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:57:05.586 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:57:05.586 INFO:tasks.workunit.client.0.vm10.stdout:start osd.1 2026-03-08T22:57:05.587 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:05.587 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:57:05.588 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:57:05.591 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:57:05.608 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:05.608+0000 7f2b30982780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:05.612 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:05.612+0000 7f2b30982780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:05.614 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:05.613+0000 7f2b30982780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:57:05.716 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:57:05.716 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:05.716 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:57:05.716 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:05.716 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:05.716 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:05.716 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:05.716 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:57:05.716 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:05.716 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:05.830 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:06.694 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:06.694+0000 7f2b30982780 -1 Falling back to public interface 2026-03-08T22:57:06.832 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:06.832 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:06.832 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:06.832 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:57:06.833 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:06.833 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:06.944 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:07.561 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:07.561+0000 7f2b30982780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:57:07.946 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:57:07.946 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:07.946 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:07.946 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:57:07.946 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:07.946 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:08.067 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:09.068 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:57:09.069 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:09.069 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:09.069 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:09.069 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:09.069 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:09.181 INFO:tasks.workunit.client.0.vm10.stdout:osd.1 up in weight 1 up_from 8 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6808/2173312074,v1:127.0.0.1:6809/2173312074] [v2:127.0.0.1:6810/2173312074,v1:127.0.0.1:6811/2173312074] exists,up fc2c92d4-ebae-473b-9859-4e51d277cc49 2026-03-08T22:57:09.181 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:09.181 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:09.181 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:09.181 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:69: TEST_reweight_vs_classes: run_osd td/crush-classes 2 2026-03-08T22:57:09.181 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-classes 2026-03-08T22:57:09.181 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/2 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-08T22:57:09.182 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/2' 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/2/journal' 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:09.182 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' 
--log-file=td/crush-classes/$name.log' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:57:09.183 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/2 2026-03-08T22:57:09.184 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:57:09.185 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=48fc0063-6d6e-421e-a989-ed15a602c78a 2026-03-08T22:57:09.185 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 48fc0063-6d6e-421e-a989-ed15a602c78a' 2026-03-08T22:57:09.185 INFO:tasks.workunit.client.0.vm10.stdout:add osd2 48fc0063-6d6e-421e-a989-ed15a602c78a 2026-03-08T22:57:09.185 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:57:09.199 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDF/q1pt5rnCxAAqkwxDqh+ZAK0f45+lauGUg== 2026-03-08T22:57:09.199 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDF/q1pt5rnCxAAqkwxDqh+ZAK0f45+lauGUg=="}' 2026-03-08T22:57:09.199 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 48fc0063-6d6e-421e-a989-ed15a602c78a -i td/crush-classes/2/new.json 2026-03-08T22:57:09.324 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:57:09.332 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-classes/2/new.json 2026-03-08T22:57:09.333 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false 
--osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDF/q1pt5rnCxAAqkwxDqh+ZAK0f45+lauGUg== --osd-uuid 48fc0063-6d6e-421e-a989-ed15a602c78a 2026-03-08T22:57:09.355 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:09.355+0000 7f36e021a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:09.357 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:09.357+0000 7f36e021a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:09.358 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:09.358+0000 7f36e021a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:09.358 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:09.359+0000 7f36e021a780 -1 bdev(0x5577670d7c00 td/crush-classes/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:57:09.358 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:09.359+0000 7f36e021a780 -1 bluestore(td/crush-classes/2) _read_fsid unparsable uuid 2026-03-08T22:57:11.996 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/2/keyring 2026-03-08T22:57:11.996 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:57:11.996 INFO:tasks.workunit.client.0.vm10.stdout:adding osd2 key to auth repository 2026-03-08T22:57:11.997 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:57:11.997 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:57:12.109 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:57:12.109 INFO:tasks.workunit.client.0.vm10.stdout:start osd.2 2026-03-08T22:57:12.109 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:12.109 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:57:12.110 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:57:12.112 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:57:12.127 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:12.127+0000 7f3794919780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:12.136 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:12.137+0000 7f3794919780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:12.138 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:12.138+0000 7f3794919780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:12.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:57:12.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:12.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:57:12.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:12.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:12.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:12.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:12.227 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:57:12.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:12.227 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:12.339 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:12.984 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:12.984+0000 7f3794919780 -1 Falling back to public interface 2026-03-08T22:57:13.340 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:13.341 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:13.341 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:57:13.341 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:13.342 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T22:57:13.342 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:13.450 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:14.104 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:14.105+0000 7f3794919780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:57:14.451 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:57:14.451 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:14.451 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:14.451 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:14.452 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:14.452 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:14.574 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:15.077 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:15.077+0000 7f37908bb640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T22:57:15.576 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:57:15.576 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:15.576 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:15.576 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:15.576 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:15.577 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:15.686 INFO:tasks.workunit.client.0.vm10.stdout:osd.2 up in weight 1 up_from 12 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6816/1814155778,v1:127.0.0.1:6817/1814155778] [v2:127.0.0.1:6818/1814155778,v1:127.0.0.1:6819/1814155778] exists,up 48fc0063-6d6e-421e-a989-ed15a602c78a 2026-03-08T22:57:15.686 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:15.686 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:15.686 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:15.686 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:71: TEST_reweight_vs_classes: ceph osd crush set-device-class ssd osd.0 
2026-03-08T22:57:15.915 INFO:tasks.workunit.client.0.vm10.stderr:osd.0 already set to class ssd. set-device-class item id 0 name 'osd.0' device_class 'ssd': no change. set osd(s) to class 'ssd' 2026-03-08T22:57:15.925 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:72: TEST_reweight_vs_classes: ceph osd crush class ls-osd ssd 2026-03-08T22:57:15.925 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:72: TEST_reweight_vs_classes: grep 0 2026-03-08T22:57:16.041 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:57:16.041 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:73: TEST_reweight_vs_classes: ceph osd crush set-device-class ssd osd.1 2026-03-08T22:57:16.325 INFO:tasks.workunit.client.0.vm10.stderr:osd.1 already set to class ssd. set-device-class item id 1 name 'osd.1' device_class 'ssd': no change. set osd(s) to class 'ssd' 2026-03-08T22:57:16.334 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:74: TEST_reweight_vs_classes: ceph osd crush class ls-osd ssd 2026-03-08T22:57:16.334 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:74: TEST_reweight_vs_classes: grep 1 2026-03-08T22:57:16.442 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:57:16.443 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:76: TEST_reweight_vs_classes: ceph osd crush reweight osd.0 1 2026-03-08T22:57:16.634 INFO:tasks.workunit.client.0.vm10.stderr:reweighted item id 0 name 'osd.0' to 1 in crush map 2026-03-08T22:57:16.643 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:78: TEST_reweight_vs_classes: hostname -s 2026-03-08T22:57:16.644 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:78: TEST_reweight_vs_classes: h=vm10 2026-03-08T22:57:16.644 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:79: TEST_reweight_vs_classes: ceph osd crush dump 2026-03-08T22:57:16.644 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:79: TEST_reweight_vs_classes: grep 65536 2026-03-08T22:57:16.645 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:79: TEST_reweight_vs_classes: jq '.buckets[] | select(.name=="vm10") | .items[0].weight' 2026-03-08T22:57:16.752 INFO:tasks.workunit.client.0.vm10.stdout:65536 2026-03-08T22:57:16.752 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:80: TEST_reweight_vs_classes: ceph osd crush dump 2026-03-08T22:57:16.752 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:80: TEST_reweight_vs_classes: grep 65536 2026-03-08T22:57:16.753 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:80: TEST_reweight_vs_classes: jq '.buckets[] | select(.name=="vm10~ssd") | .items[0].weight' 2026-03-08T22:57:16.862 INFO:tasks.workunit.client.0.vm10.stdout:65536 2026-03-08T22:57:16.863 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:82: TEST_reweight_vs_classes: ceph osd crush set 0 2 host=vm10 2026-03-08T22:57:17.064 INFO:tasks.workunit.client.0.vm10.stderr:set item id 0 name 'osd.0' weight 2 at location {host=vm10}: no change 2026-03-08T22:57:17.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:84: TEST_reweight_vs_classes: ceph osd crush dump 2026-03-08T22:57:17.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:84: TEST_reweight_vs_classes: grep 131072 2026-03-08T22:57:17.073 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:84: TEST_reweight_vs_classes: jq '.buckets[] | select(.name=="vm10") | .items[0].weight' 2026-03-08T22:57:17.183 INFO:tasks.workunit.client.0.vm10.stdout:131072 2026-03-08T22:57:17.184 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:85: TEST_reweight_vs_classes: ceph osd crush dump 2026-03-08T22:57:17.184 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:85: TEST_reweight_vs_classes: grep 131072 2026-03-08T22:57:17.185 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:85: TEST_reweight_vs_classes: jq '.buckets[] | select(.name=="vm10~ssd") | .items[0].weight' 2026-03-08T22:57:17.299 INFO:tasks.workunit.client.0.vm10.stdout:131072 2026-03-08T22:57:17.300 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:37: run: teardown td/crush-classes 2026-03-08T22:57:17.300 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-classes 2026-03-08T22:57:17.300 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:57:17.300 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-classes KILL 2026-03-08T22:57:17.300 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:17.300 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:17.300 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:17.300 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:17.300 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:17.415 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:17.415 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:57:17.415 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:57:17.415 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:57:17.416 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:57:17.416 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:57:17.416 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:57:17.417 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:57:17.417 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:57:17.417 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:57:17.418 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:57:17.419 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:57:17.420 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:57:17.420 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-classes 2026-03-08T22:57:17.434 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:57:17.434 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:17.434 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:17.434 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.66179 2026-03-08T22:57:17.435 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:57:17.435 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:57:17.435 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:34: run: for func in $funcs 2026-03-08T22:57:17.435 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:35: run: setup td/crush-classes 2026-03-08T22:57:17.435 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/crush-classes 2026-03-08T22:57:17.435 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/crush-classes 2026-03-08T22:57:17.435 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-classes 2026-03-08T22:57:17.435 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:57:17.435 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-classes KILL 2026-03-08T22:57:17.435 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:17.435 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:17.436 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:17.436 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:17.436 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:17.437 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:17.438 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:57:17.438 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:57:17.438 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:57:17.439 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:57:17.439 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:57:17.439 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:57:17.440 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:57:17.440 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:57:17.440 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:57:17.441 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:57:17.441 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:57:17.442 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:57:17.442 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-classes 2026-03-08T22:57:17.443 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:57:17.443 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:17.443 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:17.443 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.66179 2026-03-08T22:57:17.444 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:57:17.444 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:57:17.444 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/crush-classes 2026-03-08T22:57:17.445 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:57:17.445 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:17.445 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:17.445 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.66179 2026-03-08T22:57:17.446 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:57:17.446 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:57:17.446 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:57:17.446 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/crush-classes 1' TERM HUP INT 2026-03-08T22:57:17.446 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:36: run: TEST_set_device_class td/crush-classes 2026-03-08T22:57:17.446 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:142: TEST_set_device_class: local dir=td/crush-classes 2026-03-08T22:57:17.447 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:144: TEST_set_device_class: TEST_classes td/crush-classes 2026-03-08T22:57:17.447 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:89: TEST_classes: local dir=td/crush-classes 2026-03-08T22:57:17.447 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:91: TEST_classes: run_mon td/crush-classes a 2026-03-08T22:57:17.447 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/crush-classes 2026-03-08T22:57:17.447 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:57:17.447 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:57:17.447 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:57:17.447 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/crush-classes/a 2026-03-08T22:57:17.447 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-classes/a --run-dir=td/crush-classes 2026-03-08T22:57:17.471 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:57:17.471 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:17.471 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:17.471 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:17.471 
INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:17.471 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:17.472 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:57:17.472 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-classes/a '--log-file=td/crush-classes/$name.log' '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --mon-cluster-log-file=td/crush-classes/log --run-dir=td/crush-classes '--pid-file=td/crush-classes/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:57:17.502 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:57:17.503 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:57:17.504 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:57:17.504 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:57:17.504 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:57:17.504 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:57:17.504 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:57:17.504 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:57:17.505 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:57:17.505 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:17.505 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:17.505 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.66179/ceph-mon.a.asok 2026-03-08T22:57:17.505 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:57:17.505 
INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.66179/ceph-mon.a.asok config get fsid 2026-03-08T22:57:17.506 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:57:17.555 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:57:17.555 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:57:17.555 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:57:17.555 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:57:17.555 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:57:17.555 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:57:17.555 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:57:17.555 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:57:17.555 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:57:17.555 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:17.555 INFO:tasks.workunit.client.0.vm10.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:17.556 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.66179/ceph-mon.a.asok 2026-03-08T22:57:17.556 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:57:17.556 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.66179/ceph-mon.a.asok config get mon_host 2026-03-08T22:57:17.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:92: TEST_classes: run_osd td/crush-classes 0 2026-03-08T22:57:17.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-classes 2026-03-08T22:57:17.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:57:17.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:57:17.611 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:57:17.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/0 2026-03-08T22:57:17.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-08T22:57:17.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:17.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:17.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:17.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/0' 2026-03-08T22:57:17.611 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/0/journal' 2026-03-08T22:57:17.612 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:17.612 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:57:17.612 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-08T22:57:17.612 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:57:17.612 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:17.612 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:17.613 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:17.613 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:17.613 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:17.613 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:57:17.614 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:57:17.614 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/0 2026-03-08T22:57:17.615 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:57:17.616 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=a9dd7b0a-4ed9-4a2f-a48a-690b5655b137 2026-03-08T22:57:17.616 INFO:tasks.workunit.client.0.vm10.stdout:add osd0 a9dd7b0a-4ed9-4a2f-a48a-690b5655b137 2026-03-08T22:57:17.616 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 a9dd7b0a-4ed9-4a2f-a48a-690b5655b137' 2026-03-08T22:57:17.616 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:57:17.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDN/q1pd7B2JRAADYKadZwJUzsBJYdXLgmXjQ== 2026-03-08T22:57:17.628 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDN/q1pd7B2JRAADYKadZwJUzsBJYdXLgmXjQ=="}' 2026-03-08T22:57:17.628 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new a9dd7b0a-4ed9-4a2f-a48a-690b5655b137 -i td/crush-classes/0/new.json 2026-03-08T22:57:17.743 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:57:17.751 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-classes/0/new.json 2026-03-08T22:57:17.752 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDN/q1pd7B2JRAADYKadZwJUzsBJYdXLgmXjQ== --osd-uuid a9dd7b0a-4ed9-4a2f-a48a-690b5655b137 2026-03-08T22:57:17.773 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:17.774+0000 7f8c07827780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:17.775 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:17.776+0000 7f8c07827780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:17.776 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:17.777+0000 7f8c07827780 -1 WARNING: all dangerous and experimental features are enabled. 
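After run_mon brings up mon.a (ceph-mon --mkfs followed by a start with a long debug flag set, then fsid and mon_host read back over the admin socket), run_osd bootstraps each OSD in four steps, all visible in the trace at ceph-helpers.sh:662-668: generate a UUID, generate a cephx secret, register the OSD with the monitor (ceph osd new prints the assigned id — the bare "0" on stdout above), then run ceph-osd --mkfs. Condensed below; $id and $osd_data stand in for run_osd's arguments, and the dozens of debug/log/asok flags accumulated in $ceph_args are elided:

    # Reconstruction of the run_osd bootstrap traced above (not verbatim).
    uuid=$(uuidgen)
    OSD_SECRET=$(ceph-authtool --gen-print-key)
    echo "{\"cephx_secret\": \"$OSD_SECRET\"}" > "$osd_data/new.json"
    # Registers the uuid with the mon; prints the assigned osd id, which is
    # expected to match the $id run_osd was called with.
    ceph osd new "$uuid" -i "$osd_data/new.json"
    rm "$osd_data/new.json"
    ceph-osd -i "$id" --osd-data="$osd_data" --osd-journal="$osd_data/journal" \
        --mkfs --key "$OSD_SECRET" --osd-uuid "$uuid"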
2026-03-08T22:57:17.777 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:17.777+0000 7f8c07827780 -1 bdev(0x563e9cc5fc00 td/crush-classes/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:57:17.777 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:17.777+0000 7f8c07827780 -1 bluestore(td/crush-classes/0) _read_fsid unparsable uuid 2026-03-08T22:57:19.921 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/0/keyring 2026-03-08T22:57:19.921 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:57:19.922 INFO:tasks.workunit.client.0.vm10.stdout:adding osd0 key to auth repository 2026-03-08T22:57:19.922 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:57:19.922 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:57:20.034 INFO:tasks.workunit.client.0.vm10.stdout:start osd.0 2026-03-08T22:57:20.035 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:57:20.035 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:20.035 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:57:20.036 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:57:20.037 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:57:20.052 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:20.052+0000 7f1d3e7f9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:20.057 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:20.058+0000 7f1d3e7f9780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:20.060 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:20.059+0000 7f1d3e7f9780 -1 WARNING: all dangerous and experimental features are enabled. 
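The bdev "open stat got: (1) Operation not permitted" and bluestore "_read_fsid unparsable uuid" messages appear while --mkfs probes a not-yet-initialized data directory; mkfs still completes here (osd.0 reaches "up" below), so in this run they read as first-run probing noise rather than a failure. Once the daemon is started, run_osd only waits for "up" if the cluster's noup flag is clear, using the pipeline traced at ceph-helpers.sh:681. Logically equivalent sketch; the error handling on the else branch is an assumption, since the trace only shows the success path:

    # noup check reconstructed from the ceph-helpers.sh:681/684 trace.
    if ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"'; then
        : # noup is set: the OSD cannot be marked up, so skip waiting
    else
        wait_for_osd up "$id" || return 1   # failure handling assumed
    fi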
2026-03-08T22:57:20.143 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:57:20.143 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:20.143 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:57:20.143 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:20.143 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:20.143 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:20.143 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:20.143 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:57:20.143 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:20.143 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:57:20.257 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:20.627 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:20.627+0000 7f1d3e7f9780 -1 Falling back to public interface 2026-03-08T22:57:21.259 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:21.259 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:21.259 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:21.259 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:57:21.260 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:21.260 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:57:21.368 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:21.486 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:21.487+0000 7f1d3e7f9780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:57:22.371 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:57:22.372 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:22.372 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:22.372 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:57:22.372 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:22.372 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:57:22.489 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:22.684 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:22.684+0000 7f1d3a79b640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:57:23.491 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:57:23.491 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:23.491 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:23.491 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:23.491 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:23.491 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:57:23.601 INFO:tasks.workunit.client.0.vm10.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/3482862664,v1:127.0.0.1:6801/3482862664] [v2:127.0.0.1:6802/3482862664,v1:127.0.0.1:6803/3482862664] exists,up a9dd7b0a-4ed9-4a2f-a48a-690b5655b137 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:93: TEST_classes: run_osd td/crush-classes 1 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-classes 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/1 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none 
--mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/1' 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/1/journal' 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:57:23.602 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:57:23.603 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:57:23.603 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/1 2026-03-08T22:57:23.604 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:57:23.605 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=44538c3a-2215-4222-977c-5c08d69cc4e1 2026-03-08T22:57:23.605 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 44538c3a-2215-4222-977c-5c08d69cc4e1' 2026-03-08T22:57:23.605 INFO:tasks.workunit.client.0.vm10.stdout:add osd1 44538c3a-2215-4222-977c-5c08d69cc4e1 2026-03-08T22:57:23.605 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:57:23.619 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDT/q1pf3DvJBAA2V3Rrtk8vYfGiviA0jWSrg== 2026-03-08T22:57:23.619 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDT/q1pf3DvJBAA2V3Rrtk8vYfGiviA0jWSrg=="}' 2026-03-08T22:57:23.620 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 44538c3a-2215-4222-977c-5c08d69cc4e1 -i td/crush-classes/1/new.json 2026-03-08T22:57:23.735 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:57:23.743 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-classes/1/new.json 2026-03-08T22:57:23.744 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 
--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDT/q1pf3DvJBAA2V3Rrtk8vYfGiviA0jWSrg== --osd-uuid 44538c3a-2215-4222-977c-5c08d69cc4e1 2026-03-08T22:57:23.762 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:23.762+0000 7effad910780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:23.763 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:23.764+0000 7effad910780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:23.764 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:23.765+0000 7effad910780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:23.765 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:23.765+0000 7effad910780 -1 bdev(0x55bbe1431c00 td/crush-classes/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:57:23.765 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:23.765+0000 7effad910780 -1 bluestore(td/crush-classes/1) _read_fsid unparsable uuid 2026-03-08T22:57:26.391 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/1/keyring 2026-03-08T22:57:26.392 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:57:26.393 INFO:tasks.workunit.client.0.vm10.stdout:adding osd1 key to auth repository 2026-03-08T22:57:26.393 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:57:26.393 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:57:26.504 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:57:26.504 INFO:tasks.workunit.client.0.vm10.stdout:start osd.1 2026-03-08T22:57:26.504 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:26.504 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:57:26.505 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:57:26.507 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:57:26.521 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:26.520+0000 7f40f263f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:26.523 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:26.524+0000 7f40f263f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:26.524 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:26.524+0000 7f40f263f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:26.617 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:57:26.618 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:26.618 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:57:26.618 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:26.618 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:26.618 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:26.618 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:26.618 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:57:26.618 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:26.618 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:26.726 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:27.635 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:27.635+0000 7f40f263f780 -1 Falling back to public interface 2026-03-08T22:57:27.727 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:27.727 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:27.727 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:27.727 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:57:27.728 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:27.728 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:27.842 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:28.764 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:28.763+0000 7f40f263f780 -1 osd.1 0 log_to_monitors true 2026-03-08T22:57:28.844 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:57:28.844 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:28.844 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:28.845 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:28.845 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:28.845 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:28.959 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:29.962 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:57:29.962 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:29.962 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:29.962 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:29.962 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:29.962 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:30.070 INFO:tasks.workunit.client.0.vm10.stdout:osd.1 up in weight 1 up_from 8 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6808/797630271,v1:127.0.0.1:6809/797630271] [v2:127.0.0.1:6810/797630271,v1:127.0.0.1:6811/797630271] exists,up 44538c3a-2215-4222-977c-5c08d69cc4e1 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:94: TEST_classes: run_osd td/crush-classes 2 2026-03-08T22:57:30.071 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/crush-classes 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/crush-classes/2 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/crush-classes/2' 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/crush-classes/2/journal' 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:30.071 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo 
/tmp/ceph-asok.66179 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:57:30.072 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/crush-classes/2 2026-03-08T22:57:30.074 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:57:30.074 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=4d145b8f-06b9-4317-8495-5b369b99cfb1 2026-03-08T22:57:30.075 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 4d145b8f-06b9-4317-8495-5b369b99cfb1' 2026-03-08T22:57:30.075 INFO:tasks.workunit.client.0.vm10.stdout:add osd2 4d145b8f-06b9-4317-8495-5b369b99cfb1 2026-03-08T22:57:30.075 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:57:30.088 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDa/q1pCaI6BRAA3OKuVz3S9p7tarb4TGqElQ== 2026-03-08T22:57:30.088 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDa/q1pCaI6BRAA3OKuVz3S9p7tarb4TGqElQ=="}' 2026-03-08T22:57:30.088 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 4d145b8f-06b9-4317-8495-5b369b99cfb1 -i td/crush-classes/2/new.json 2026-03-08T22:57:30.211 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:57:30.220 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/crush-classes/2/new.json 2026-03-08T22:57:30.221 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDa/q1pCaI6BRAA3OKuVz3S9p7tarb4TGqElQ== --osd-uuid 4d145b8f-06b9-4317-8495-5b369b99cfb1 2026-03-08T22:57:30.240 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:30.240+0000 7fca32c10780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:30.244 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:30.243+0000 7fca32c10780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:30.245 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:30.246+0000 7fca32c10780 -1 WARNING: all dangerous and experimental features are enabled. 
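The get_asok_dir/get_asok_path pair traced repeatedly above resolves where admin sockets live: the directory falls back to /tmp/ceph-asok.<pid> (here /tmp/ceph-asok.66179) unless an override is set, and the path is either the concrete ceph-$name.asok when a daemon name is given, or the literal template '$cluster-$name.asok' (left single-quoted for the daemon itself to expand) when none is. A sketch reconstructed from the traced branches; the override variable name and the use of $$ for the suffix are assumptions, since the trace only shows "[ -n '' ]" and the already-expanded value:

    # Reconstructed from the get_asok_dir/get_asok_path traces above.
    get_asok_dir() {
        if [ -n "$CEPH_ASOK_DIR" ]; then  # override variable name assumed
            echo "$CEPH_ASOK_DIR"
        else
            echo "/tmp/ceph-asok.$$"      # PID suffix assumed from 66179
        fi
    }
    get_asok_path() {
        local name=$1
        if [ -n "$name" ]; then
            echo "$(get_asok_dir)/ceph-$name.asok"
        else
            echo "$(get_asok_dir)/\$cluster-\$name.asok"  # expanded by the daemon
        fi
    }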
2026-03-08T22:57:30.246 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:30.246+0000 7fca32c10780 -1 bdev(0x55ae865d3c00 td/crush-classes/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:57:30.246 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:30.246+0000 7fca32c10780 -1 bluestore(td/crush-classes/2) _read_fsid unparsable uuid 2026-03-08T22:57:32.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/crush-classes/2/keyring 2026-03-08T22:57:32.892 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:57:32.893 INFO:tasks.workunit.client.0.vm10.stdout:adding osd2 key to auth repository 2026-03-08T22:57:32.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:57:32.893 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/crush-classes/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:57:33.006 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:57:33.006 INFO:tasks.workunit.client.0.vm10.stdout:start osd.2 2026-03-08T22:57:33.006 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=1f583fc4-61aa-4c5d-82f4-5751c3c37144 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.66179/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:33.006 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:57:33.007 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:57:33.008 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:57:33.023 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:33.022+0000 7f447143f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:33.025 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:33.025+0000 7f447143f780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:33.027 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:33.026+0000 7f447143f780 -1 WARNING: all dangerous and experimental features are enabled. 
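wait_for_osd, traced next for osd.2 (and twice above for osd.0 and osd.1), simply polls "ceph osd dump" once a second, up to 300 attempts, until "osd.$id up" appears; the bare integers interleaved on stdout (0, 1, 2, 3) are the loop counter being echoed on each attempt. Reconstructed from the ceph-helpers.sh:978-991 trace:

    # wait_for_osd as reconstructed from the trace; matches the traced
    # control flow, not necessarily the helper's exact source.
    wait_for_osd() {
        local state=$1
        local id=$2
        local status=1
        for ((i = 0; i < 300; i++)); do
            echo $i
            if ceph osd dump | grep "osd.$id $state"; then
                status=0
                break
            fi
            sleep 1
        done
        return $status
    }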
2026-03-08T22:57:33.115 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:57:33.115 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:33.115 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:57:33.115 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:33.115 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:33.115 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:33.115 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:33.115 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:57:33.115 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:33.116 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:33.215 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:34.216 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:34.217 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:34.217 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:34.217 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:57:34.217 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:34.217 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:34.324 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:34.365 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:34.364+0000 7f447143f780 -1 Falling back to public interface 2026-03-08T22:57:35.231 INFO:tasks.workunit.client.0.vm10.stderr:2026-03-08T22:57:35.231+0000 7f447143f780 -1 osd.2 0 log_to_monitors true 2026-03-08T22:57:35.325 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:35.326 INFO:tasks.workunit.client.0.vm10.stdout:2 2026-03-08T22:57:35.326 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:35.326 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:57:35.326 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:35.326 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:35.437 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:36.438 INFO:tasks.workunit.client.0.vm10.stdout:3 2026-03-08T22:57:36.438 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:36.438 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:36.438 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:36.439 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:36.439 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:36.552 INFO:tasks.workunit.client.0.vm10.stdout:osd.2 up in weight 1 up_from 12 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6816/4203334486,v1:127.0.0.1:6817/4203334486] [v2:127.0.0.1:6818/4203334486,v1:127.0.0.1:6819/4203334486] exists,up 4d145b8f-06b9-4317-8495-5b369b99cfb1 2026-03-08T22:57:36.552 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:36.552 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:36.552 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:36.552 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:95: TEST_classes: create_rbd_pool 2026-03-08T22:57:36.552 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T22:57:36.652 INFO:tasks.workunit.client.0.vm10.stderr:pool 'rbd' does not exist 2026-03-08T22:57:36.659 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T22:57:36.659 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:57:36.814 INFO:tasks.workunit.client.0.vm10.stderr:pool 'rbd' already exists 2026-03-08T22:57:36.823 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:57:37.825 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T22:57:38.118 
INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:97: TEST_classes: get_osds_up rbd SOMETHING 2026-03-08T22:57:38.118 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd 2026-03-08T22:57:38.118 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING 2026-03-08T22:57:38.118 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING 2026-03-08T22:57:38.118 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . -o ' ' 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=1 2 0 ' 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 1 2 0 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:97: TEST_classes: test '1 2 0' == '1 2 0' 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:98: TEST_classes: add_something td/crush-classes SOMETHING 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:42: add_something: local dir=td/crush-classes 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:43: add_something: local obj=SOMETHING 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:45: add_something: local payload=ABCDEF 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:46: add_something: echo ABCDEF 2026-03-08T22:57:38.234 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:47: add_something: rados --pool rbd put SOMETHING td/crush-classes/ORIGINAL 2026-03-08T22:57:38.258 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:104: TEST_classes: ceph osd getcrushmap 2026-03-08T22:57:38.368 INFO:tasks.workunit.client.0.vm10.stderr:4 2026-03-08T22:57:38.376 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:105: TEST_classes: crushtool -d td/crush-classes/map -o td/crush-classes/map.txt 2026-03-08T22:57:38.388 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:106: TEST_classes: sed -i -e '/device 0 osd.0/s/$/ class ssd/' -e '/step take default/s/$/ class ssd/' td/crush-classes/map.txt 2026-03-08T22:57:38.389 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:110: TEST_classes: crushtool -c td/crush-classes/map.txt -o td/crush-classes/map-new 2026-03-08T22:57:38.401 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:111: TEST_classes: ceph osd setcrushmap -i td/crush-classes/map-new 2026-03-08T22:57:38.614 INFO:tasks.workunit.client.0.vm10.stderr:6 2026-03-08T22:57:38.629 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:117: TEST_classes: ok=false 2026-03-08T22:57:38.629 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:118: TEST_classes: for delay in 2 4 8 16 32 64 128 256 2026-03-08T22:57:38.629 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:119: TEST_classes: get_osds_up rbd SOMETHING_ELSE 2026-03-08T22:57:38.629 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd 2026-03-08T22:57:38.629 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING_ELSE 2026-03-08T22:57:38.629 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING_ELSE 2026-03-08T22:57:38.630 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . -o ' ' 2026-03-08T22:57:38.744 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=0 ' 2026-03-08T22:57:38.744 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 0 2026-03-08T22:57:38.745 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:119: TEST_classes: test 0 == 0 2026-03-08T22:57:38.745 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:120: TEST_classes: ok=true 2026-03-08T22:57:38.745 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:121: TEST_classes: break 2026-03-08T22:57:38.745 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:127: TEST_classes: true 2026-03-08T22:57:38.745 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:132: TEST_classes: add_something td/crush-classes SOMETHING_ELSE 2026-03-08T22:57:38.745 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:42: add_something: local dir=td/crush-classes 2026-03-08T22:57:38.745 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:43: add_something: local obj=SOMETHING_ELSE 2026-03-08T22:57:38.745 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:45: add_something: local payload=ABCDEF 2026-03-08T22:57:38.745 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:46: add_something: echo ABCDEF 2026-03-08T22:57:38.745 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:47: add_something: rados --pool rbd put SOMETHING_ELSE td/crush-classes/ORIGINAL 2026-03-08T22:57:38.774 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:138: TEST_classes: ceph osd crush dump 2026-03-08T22:57:38.774 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:138: TEST_classes: grep -q '~ssd' 2026-03-08T22:57:38.886 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:146: TEST_set_device_class: ceph osd crush set-device-class ssd osd.0 2026-03-08T22:57:39.133 INFO:tasks.workunit.client.0.vm10.stderr:osd.0 already set to class ssd. set-device-class item id 0 name 'osd.0' device_class 'ssd': no change. set osd(s) to class 'ssd' 2026-03-08T22:57:39.141 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:147: TEST_set_device_class: ceph osd crush class ls-osd ssd 2026-03-08T22:57:39.141 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:147: TEST_set_device_class: grep 0 2026-03-08T22:57:39.250 INFO:tasks.workunit.client.0.vm10.stdout:0 2026-03-08T22:57:39.250 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:148: TEST_set_device_class: ceph osd crush set-device-class ssd osd.1 2026-03-08T22:57:39.487 INFO:tasks.workunit.client.0.vm10.stderr:osd.1 already set to class ssd. set-device-class item id 1 name 'osd.1' device_class 'ssd': no change. set osd(s) to class 'ssd' 2026-03-08T22:57:39.500 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:149: TEST_set_device_class: ceph osd crush class ls-osd ssd 2026-03-08T22:57:39.500 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:149: TEST_set_device_class: grep 1 2026-03-08T22:57:39.615 INFO:tasks.workunit.client.0.vm10.stdout:1 2026-03-08T22:57:39.615 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:150: TEST_set_device_class: ceph osd crush set-device-class ssd 0 1 2026-03-08T22:57:39.848 INFO:tasks.workunit.client.0.vm10.stderr:osd.0 already set to class ssd. set-device-class item id 0 name 'osd.0' device_class 'ssd': no change. osd.1 already set to class ssd. set-device-class item id 1 name 'osd.1' device_class 'ssd': no change. 
set osd(s) to class 'ssd' 2026-03-08T22:57:39.857 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:152: TEST_set_device_class: ok=false 2026-03-08T22:57:39.857 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:153: TEST_set_device_class: for delay in 2 4 8 16 32 64 128 256 2026-03-08T22:57:39.857 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:154: TEST_set_device_class: get_osds_up rbd SOMETHING_ELSE 2026-03-08T22:57:39.857 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd 2026-03-08T22:57:39.857 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING_ELSE 2026-03-08T22:57:39.857 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING_ELSE 2026-03-08T22:57:39.857 INFO:tasks.workunit.client.0.vm10.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . -o ' ' 2026-03-08T22:57:39.963 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=0 1 ' 2026-03-08T22:57:39.963 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 0 1 2026-03-08T22:57:39.963 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:154: TEST_set_device_class: test '0 1' == '0 1' 2026-03-08T22:57:39.963 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:155: TEST_set_device_class: ok=true 2026-03-08T22:57:39.963 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:156: TEST_set_device_class: break 2026-03-08T22:57:39.964 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:163: TEST_set_device_class: true 2026-03-08T22:57:39.964 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:37: run: teardown td/crush-classes 2026-03-08T22:57:39.964 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-classes 2026-03-08T22:57:39.964 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:57:39.964 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-classes KILL 2026-03-08T22:57:39.964 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:39.964 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:39.964 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:39.964 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:39.964 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:40.075 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:40.075 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:57:40.076 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:57:40.076 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:57:40.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:57:40.077 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:57:40.077 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:57:40.078 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:57:40.078 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:57:40.078 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:57:40.078 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:57:40.079 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:57:40.079 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:57:40.079 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-classes 2026-03-08T22:57:40.092 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:57:40.092 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:40.092 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:40.092 
INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.66179 2026-03-08T22:57:40.093 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:57:40.093 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:57:40.093 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0 2026-03-08T22:57:40.093 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/crush-classes 0 2026-03-08T22:57:40.094 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/crush-classes 2026-03-08T22:57:40.094 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0 2026-03-08T22:57:40.094 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/crush-classes KILL 2026-03-08T22:57:40.094 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:40.094 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:40.094 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:40.094 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:40.094 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:40.095 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:40.096 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:57:40.096 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:57:40.097 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
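Stripped of the xtrace noise, the body of TEST_classes traced between the run_osd calls and this teardown reduces to three reusable pieces: a mapping probe, an offline CRUSH-map edit, and a bounded retry (the wait_for_osd poll at ceph-helpers.sh:978-991 is the same retry pattern with a fixed one-second sleep and a 300-iteration cap). A sketch of each, with names and delay values taken directly from crush-classes.sh as traced:

    # 1. Which OSDs is an object mapped to right now? (crush-classes.sh:51-57)
    get_osds_up() {
        local poolname=$1 objectname=$2
        local osds=$(ceph --format xml osd map $poolname $objectname | \
                     xmlstarlet sel -t -m "//up/osd" -v . -o ' ')
        echo $osds
    }

    # 2. Edit the CRUSH map offline (crush-classes.sh:104-111): dump, decompile,
    #    tag osd.0 and the rule's "step take default" with class ssd, recompile,
    #    inject. The bare numbers on stderr above ("4", then "6") are the crush
    #    map versions before and after the change.
    ceph osd getcrushmap > $dir/map
    crushtool -d $dir/map -o $dir/map.txt
    sed -i -e '/device 0 osd.0/s/$/ class ssd/' \
           -e '/step take default/s/$/ class ssd/' $dir/map.txt
    crushtool -c $dir/map.txt -o $dir/map-new
    ceph osd setcrushmap -i $dir/map-new

    # 3. Wait with exponential backoff until the class-filtered rule takes
    #    effect: the object must map to osd.0 only (crush-classes.sh:117-127).
    ok=false
    for delay in 2 4 8 16 32 64 128 256; do
        if test "$(get_osds_up rbd SOMETHING_ELSE)" == "0"; then
            ok=true
            break
        fi
        sleep $delay
    done
    $ok || return 1

In this run the very first probe already returned "0", so the loop never slept; the earlier probe against SOMETHING had returned "1 2 0", confirming all three OSDs were in play before the rule was restricted to class ssd.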
2026-03-08T22:57:40.097 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:57:40.097 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:57:40.098 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:57:40.098 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:57:40.098 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:57:40.099 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:57:40.099 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:57:40.099 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:57:40.100 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-08T22:57:40.100 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/crush-classes 2026-03-08T22:57:40.101 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:57:40.101 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:40.101 INFO:tasks.workunit.client.0.vm10.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.66179 2026-03-08T22:57:40.101 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.66179 2026-03-08T22:57:40.102 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:57:40.102 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:57:40.102 INFO:tasks.workunit.client.0.vm10.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T22:57:40.103 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T22:57:40.103 DEBUG:teuthology.orchestra.run.vm10:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T22:57:40.167 INFO:tasks.workunit:Stopping ['crush'] on client.0... 
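The teardown that ran twice above (once from run() and once from main with dumplogs=0, ceph-helpers.sh:164-207) follows this shape. A condensed sketch only: the FreeBSD and btrfs branches the trace skips are elided, the core-dump test is simplified to the observable checks, and the log-dump line is a stand-in since the helper's real reporting path is not visible in this excerpt:

    # teardown, condensed from ceph-helpers.sh:164-207 as traced above.
    teardown() {
        local dir=$1 dumplogs=$2
        kill_daemons $dir KILL                 # returned 0 above: nothing left running
        local cores=no
        local pattern=$(sysctl -n kernel.core_pattern)
        # core_pattern points at a directory here; any file in it means a crash.
        [ -n "$(ls $(dirname $pattern) 2>/dev/null)" ] && cores=yes
        if [ $cores = yes -o "$dumplogs" = 1 ]; then
            tail -n +1 $dir/*.log              # stand-in for the helper's log dump
        fi
        rm -fr $dir
        rm -rf $(get_asok_dir)                 # /tmp/ceph-asok.66179 in this run
        [ $cores = no ]                        # non-zero exit if anything dumped core
    }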
2026-03-08T22:57:40.167 DEBUG:teuthology.orchestra.run.vm10:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-08T22:57:40.565 DEBUG:teuthology.parallel:result is None 2026-03-08T22:57:40.566 DEBUG:teuthology.orchestra.run.vm10:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-08T22:57:40.587 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-08T22:57:40.587 DEBUG:teuthology.orchestra.run.vm10:> rmdir -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:57:40.644 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0 2026-03-08T22:57:40.644 DEBUG:teuthology.run_tasks:Unwinding manager install 2026-03-08T22:57:40.646 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer... 2026-03-08T22:57:40.646 DEBUG:teuthology.orchestra.run.vm10:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-08T22:57:40.718 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, ceph-volume, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 2026-03-08T22:57:40.718 DEBUG:teuthology.orchestra.run.vm10:> 2026-03-08T22:57:40.718 DEBUG:teuthology.orchestra.run.vm10:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-08T22:57:40.718 DEBUG:teuthology.orchestra.run.vm10:> sudo yum -y remove $d || true 2026-03-08T22:57:40.718 DEBUG:teuthology.orchestra.run.vm10:> done 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved. 
2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================ 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout: Package Arch Version Repository Size 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================ 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout:Removing: 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout: ceph-radosgw x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 39 M 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout:Removing unused dependencies: 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout:Transaction Summary 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================ 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout:Remove 2 Packages 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout:Freed space: 39 M 2026-03-08T22:57:40.916 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction check 2026-03-08T22:57:40.918 INFO:teuthology.orchestra.run.vm10.stdout:Transaction check succeeded. 2026-03-08T22:57:40.918 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction test 2026-03-08T22:57:40.931 INFO:teuthology.orchestra.run.vm10.stdout:Transaction test succeeded. 2026-03-08T22:57:40.932 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction 2026-03-08T22:57:40.963 INFO:teuthology.orchestra.run.vm10.stdout: Preparing : 1/1 2026-03-08T22:57:40.981 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2 2026-03-08T22:57:40.981 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:57:40.981 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-08T22:57:40.981 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-08T22:57:40.981 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 
2026-03-08T22:57:40.981 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:40.984 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2 2026-03-08T22:57:40.992 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2 2026-03-08T22:57:41.006 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-08T22:57:41.070 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-08T22:57:41.070 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2 2026-03-08T22:57:41.113 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-08T22:57:41.113 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:41.113 INFO:teuthology.orchestra.run.vm10.stdout:Removed: 2026-03-08T22:57:41.113 INFO:teuthology.orchestra.run.vm10.stdout: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-08T22:57:41.113 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:41.113 INFO:teuthology.orchestra.run.vm10.stdout:Complete! 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved. 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================ 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout: Package Arch Version Repository Size 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================ 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout:Removing: 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout: ceph-test x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 210 M 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout:Removing unused dependencies: 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout:Transaction Summary 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================ 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout:Remove 4 Packages 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout:Freed space: 212 M 2026-03-08T22:57:41.303 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction check 2026-03-08T22:57:41.306 INFO:teuthology.orchestra.run.vm10.stdout:Transaction check succeeded. 2026-03-08T22:57:41.306 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction test 2026-03-08T22:57:41.327 INFO:teuthology.orchestra.run.vm10.stdout:Transaction test succeeded. 
2026-03-08T22:57:41.327 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction 2026-03-08T22:57:41.387 INFO:teuthology.orchestra.run.vm10.stdout: Preparing : 1/1 2026-03-08T22:57:41.394 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 1/4 2026-03-08T22:57:41.396 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4 2026-03-08T22:57:41.399 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4 2026-03-08T22:57:41.414 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-08T22:57:41.472 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-08T22:57:41.472 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 1/4 2026-03-08T22:57:41.472 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4 2026-03-08T22:57:41.472 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4 2026-03-08T22:57:41.515 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4 2026-03-08T22:57:41.516 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:41.516 INFO:teuthology.orchestra.run.vm10.stdout:Removed: 2026-03-08T22:57:41.516 INFO:teuthology.orchestra.run.vm10.stdout: ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 libxslt-1.1.34-12.el9.x86_64 2026-03-08T22:57:41.516 INFO:teuthology.orchestra.run.vm10.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64 2026-03-08T22:57:41.516 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:41.516 INFO:teuthology.orchestra.run.vm10.stdout:Complete! 2026-03-08T22:57:41.732 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved. 
2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================ 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout: Package Arch Version Repository Size 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================ 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout:Removing: 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout: ceph x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 0 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout:Removing unused dependencies: 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mds x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 7.5 M 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mon x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 18 M 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout: lua x86_64 5.4.4-4.el9 @appstream 593 k 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout: lua-devel x86_64 5.4.4-4.el9 @crb 49 k 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout: luarocks noarch 3.9.2-5.el9 @epel 692 k 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout: unzip x86_64 6.0-59.el9 @baseos 389 k 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout: zip x86_64 3.0-35.el9 @baseos 724 k 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout:Transaction Summary 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================ 2026-03-08T22:57:41.733 INFO:teuthology.orchestra.run.vm10.stdout:Remove 8 Packages 2026-03-08T22:57:41.734 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:41.734 INFO:teuthology.orchestra.run.vm10.stdout:Freed space: 28 M 2026-03-08T22:57:41.734 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction check 2026-03-08T22:57:41.736 INFO:teuthology.orchestra.run.vm10.stdout:Transaction check succeeded. 2026-03-08T22:57:41.736 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction test 2026-03-08T22:57:41.759 INFO:teuthology.orchestra.run.vm10.stdout:Transaction test succeeded. 2026-03-08T22:57:41.759 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction 2026-03-08T22:57:41.828 INFO:teuthology.orchestra.run.vm10.stdout: Preparing : 1/1 2026-03-08T22:57:41.833 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 1/8 2026-03-08T22:57:41.837 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : luarocks-3.9.2-5.el9.noarch 2/8 2026-03-08T22:57:41.839 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : lua-devel-5.4.4-4.el9.x86_64 3/8 2026-03-08T22:57:41.842 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : zip-3.0-35.el9.x86_64 4/8 2026-03-08T22:57:41.844 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : unzip-6.0-59.el9.x86_64 5/8 2026-03-08T22:57:41.847 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : lua-5.4.4-4.el9.x86_64 6/8 2026-03-08T22:57:41.868 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 7/8 2026-03-08T22:57:41.868 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-08T22:57:41.868 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-08T22:57:41.868 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target". 2026-03-08T22:57:41.868 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target". 2026-03-08T22:57:41.868 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:41.869 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 7/8 2026-03-08T22:57:41.876 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 7/8 2026-03-08T22:57:41.897 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/8 2026-03-08T22:57:41.897 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:57:41.897 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-08T22:57:41.897 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target". 2026-03-08T22:57:41.897 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target". 2026-03-08T22:57:41.897 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:41.899 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/8 2026-03-08T22:57:41.983 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/8 2026-03-08T22:57:41.983 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 1/8 2026-03-08T22:57:41.983 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 2/8 2026-03-08T22:57:41.983 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 3/8 2026-03-08T22:57:41.983 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : lua-5.4.4-4.el9.x86_64 4/8 2026-03-08T22:57:41.983 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : lua-devel-5.4.4-4.el9.x86_64 5/8 2026-03-08T22:57:41.983 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : luarocks-3.9.2-5.el9.noarch 6/8 2026-03-08T22:57:41.983 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : unzip-6.0-59.el9.x86_64 7/8 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : zip-3.0-35.el9.x86_64 8/8 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout:Removed: 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout: ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout: lua-5.4.4-4.el9.x86_64 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout: lua-devel-5.4.4-4.el9.x86_64 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout: luarocks-3.9.2-5.el9.noarch 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout: unzip-6.0-59.el9.x86_64 2026-03-08T22:57:42.032 
INFO:teuthology.orchestra.run.vm10.stdout: zip-3.0-35.el9.x86_64 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout: 2026-03-08T22:57:42.032 INFO:teuthology.orchestra.run.vm10.stdout:Complete! 2026-03-08T22:57:42.241 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved. 2026-03-08T22:57:42.246 INFO:teuthology.orchestra.run.vm10.stdout:=========================================================================================== 2026-03-08T22:57:42.246 INFO:teuthology.orchestra.run.vm10.stdout: Package Arch Version Repository Size 2026-03-08T22:57:42.246 INFO:teuthology.orchestra.run.vm10.stdout:=========================================================================================== 2026-03-08T22:57:42.246 INFO:teuthology.orchestra.run.vm10.stdout:Removing: 2026-03-08T22:57:42.246 INFO:teuthology.orchestra.run.vm10.stdout: ceph-base x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 23 M 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout:Removing dependent packages: 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-immutable-object-cache x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 431 k 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 3.4 M 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-cephadm noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 806 k 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-dashboard noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 88 M 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-diskprediction-local noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 66 M 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-rook noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 563 k 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-osd x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 59 M 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-volume noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 1.4 M 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: rbd-mirror x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 13 M 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout:Removing unused dependencies: 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: abseil-cpp x86_64 20211102.0-4.el9 @epel 1.9 M 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 85 M 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-grafana-dashboards noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 628 k 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-modules-core noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 1.5 M 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-prometheus-alerts noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 52 k 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ceph-selinux x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 138 k 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: cryptsetup x86_64 2.8.1-3.el9 @baseos 770 k 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M 2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 
@appstream 39 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: grpc-data noarch 1.46.7-10.el9 @epel 13 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: libcephsqlite x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 425 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: libradosstriper1 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 1.6 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: pciutils x86_64 3.7.0-7.el9 @baseos 216 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: protobuf x86_64 3.14.0-17.el9 @appstream 3.5 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: protobuf-compiler x86_64 3.14.0-17.el9 @crb 2.9 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 702 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-grpcio x86_64 1.46.7-10.el9 @epel 6.7 M
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-grpcio-tools x86_64 1.46.7-10.el9 @epel 418 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-08T22:57:42.247 INFO:teuthology.orchestra.run.vm10.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-packaging noarch 20.9-5.el9 @appstream 248 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-protobuf noarch 3.14.0-17.el9 @appstream 1.4 M
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyparsing noarch 2.4.7-9.el9 @baseos 635 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: qatlib x86_64 25.08.0-2.el9 @appstream 639 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: qatlib-service x86_64 25.08.0-2.el9 @appstream 69 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout: qatzip-libs x86_64 1.3.1-1.el9 @appstream 148 k
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout:Transaction Summary
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout:===========================================================================================
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout:Remove 103 Packages
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout:Freed space: 613 M
2026-03-08T22:57:42.248 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction check
2026-03-08T22:57:42.275 INFO:teuthology.orchestra.run.vm10.stdout:Transaction check succeeded.
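The table above is dnf resolving the first cleanup transaction: 103 packages queued for removal and 613 M to be freed. For readers mining these logs, a small hypothetical helper (not part of teuthology; the regex and function names are ours) can recover the package rows from an excerpt like this one:

    import re

    # Matches one dnf table row embedded in a teuthology log record, e.g.
    # "... INFO:teuthology.orchestra.run.vm10.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M"
    ROW = re.compile(
        r"INFO:teuthology\.orchestra\.run\.[\w.-]+\.stdout:\s+"
        r"(?P<name>[A-Za-z0-9._+-]+)\s+(?P<arch>noarch|x86_64)\s+"
        r"(?P<ver>\S+)\s+(?P<repo>@\S+)"
    )

    def removal_rows(lines):
        """Yield (name, arch, version, repo) for each package row found."""
        for line in lines:
            m = ROW.search(line)
            if m:
                yield m.group("name", "arch", "ver", "repo")

Progress lines such as "Erasing :" and "Verifying :" do not match the pattern, because the token after the package name is ":" rather than an architecture, so the helper picks out only the resolution table itself.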
2026-03-08T22:57:42.275 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction test
2026-03-08T22:57:42.382 INFO:teuthology.orchestra.run.vm10.stdout:Transaction test succeeded.
2026-03-08T22:57:42.382 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction
2026-03-08T22:57:42.526 INFO:teuthology.orchestra.run.vm10.stdout: Preparing : 1/1
2026-03-08T22:57:42.526 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 1/103
2026-03-08T22:57:42.537 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 1/103
2026-03-08T22:57:42.556 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103
2026-03-08T22:57:42.556 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:57:42.556 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-08T22:57:42.556 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-08T22:57:42.556 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-08T22:57:42.556 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:42.557 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103
2026-03-08T22:57:42.575 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103
2026-03-08T22:57:42.601 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 3/103
2026-03-08T22:57:42.601 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 4/103
2026-03-08T22:57:42.656 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 4/103
2026-03-08T22:57:42.669 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/103
2026-03-08T22:57:42.674 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/103
2026-03-08T22:57:42.675 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 7/103
2026-03-08T22:57:42.691 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 7/103
2026-03-08T22:57:42.698 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/103
2026-03-08T22:57:42.703 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/103
2026-03-08T22:57:42.715 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-grpcio-tools-1.46.7-10.el9.x86_64 10/103
2026-03-08T22:57:42.724 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-grpcio-1.46.7-10.el9.x86_64 11/103
2026-03-08T22:57:42.748 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103
2026-03-08T22:57:42.748 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:57:42.748 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
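The "Glob pattern passed to enable" and "Invalid unit name" warnings come from the ceph packages' scriptlets invoking systemctl against an instance glob such as ceph-mgr@*.service: systemctl's enable/disable path does not accept globs, and systemd escapes the disallowed "*" (0x2a) when it normalizes the unit name. A rough sketch of that escaping rule (an approximation, not systemd's full algorithm) reproduces the string seen in the log:

    # Characters outside systemd's permitted set are replaced by \xNN hex
    # escapes; this approximation is enough to explain the log line above.
    ALLOWED = set("abcdefghijklmnopqrstuvwxyz"
                  "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
                  "0123456789:-_.@\\")

    def escape_unit_name(name: str) -> str:
        return "".join(c if c in ALLOWED else "\\x%02x" % ord(c) for c in name)

    print(escape_unit_name("ceph-mgr@*.service"))  # -> ceph-mgr@\x2a.service

The warnings are harmless here: the scriptlet's real work, dropping the target symlinks under /etc/systemd/system, still happens, as the "Removed ..." lines that follow each warning show.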
2026-03-08T22:57:42.748 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-08T22:57:42.748 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-08T22:57:42.748 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:42.753 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103
2026-03-08T22:57:42.762 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103
2026-03-08T22:57:42.777 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 13/103
2026-03-08T22:57:42.777 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:57:42.777 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-volume@*.service" escaped as "ceph-volume@\x2a.service".
2026-03-08T22:57:42.777 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:42.785 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 13/103
2026-03-08T22:57:42.795 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 13/103
2026-03-08T22:57:42.802 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 14/103
2026-03-08T22:57:42.812 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 15/103
2026-03-08T22:57:42.819 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 16/103
2026-03-08T22:57:42.832 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 17/103
2026-03-08T22:57:42.849 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 18/103
2026-03-08T22:57:42.857 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 19/103
2026-03-08T22:57:42.871 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 20/103
2026-03-08T22:57:42.881 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 21/103
2026-03-08T22:57:42.918 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 22/103
2026-03-08T22:57:42.930 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 23/103
2026-03-08T22:57:42.935 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 24/103
2026-03-08T22:57:42.948 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 25/103
2026-03-08T22:57:42.962 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 26/103
2026-03-08T22:57:42.962 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 27/103
2026-03-08T22:57:42.978 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 27/103
2026-03-08T22:57:43.078 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 28/103
2026-03-08T22:57:43.102 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 29/103
2026-03-08T22:57:43.117 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 30/103
2026-03-08T22:57:43.117 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-08T22:57:43.117 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:43.118 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 30/103
2026-03-08T22:57:43.151 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 30/103
2026-03-08T22:57:43.169 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 31/103
2026-03-08T22:57:43.175 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 32/103
2026-03-08T22:57:43.179 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : protobuf-compiler-3.14.0-17.el9.x86_64 33/103
2026-03-08T22:57:43.182 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 34/103
2026-03-08T22:57:43.206 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 35/103
2026-03-08T22:57:43.206 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:57:43.206 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-08T22:57:43.206 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-08T22:57:43.206 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-08T22:57:43.206 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:43.207 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 35/103
2026-03-08T22:57:43.222 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 35/103
2026-03-08T22:57:43.226 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 36/103
2026-03-08T22:57:43.230 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 37/103
2026-03-08T22:57:43.235 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-packaging-20.9-5.el9.noarch 38/103
2026-03-08T22:57:43.238 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 39/103
2026-03-08T22:57:43.243 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 40/103
2026-03-08T22:57:43.248 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 41/103
2026-03-08T22:57:43.253 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 42/103
2026-03-08T22:57:43.260 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 43/103
2026-03-08T22:57:43.307 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 44/103
2026-03-08T22:57:43.320 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 45/103
2026-03-08T22:57:43.323 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 46/103
2026-03-08T22:57:43.328 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 47/103
2026-03-08T22:57:43.330 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 48/103
2026-03-08T22:57:43.334 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 49/103
2026-03-08T22:57:43.337 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 50/103
2026-03-08T22:57:43.362 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-immutable-object-cache-2:19.2.3-678.ge911bd 51/103
2026-03-08T22:57:43.362 INFO:teuthology.orchestra.run.vm10.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:57:43.362 INFO:teuthology.orchestra.run.vm10.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-08T22:57:43.362 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:43.362 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 51/103
2026-03-08T22:57:43.372 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-immutable-object-cache-2:19.2.3-678.ge911bd 51/103
2026-03-08T22:57:43.374 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 52/103
2026-03-08T22:57:43.376 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 53/103
2026-03-08T22:57:43.380 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-ply-3.11-14.el9.noarch 54/103
2026-03-08T22:57:43.382 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 55/103
2026-03-08T22:57:43.385 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 56/103
2026-03-08T22:57:43.388 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 57/103
2026-03-08T22:57:43.391 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 58/103
2026-03-08T22:57:43.394 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 59/103
2026-03-08T22:57:43.396 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-pyparsing-2.4.7-9.el9.noarch 60/103
2026-03-08T22:57:43.404 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 61/103
2026-03-08T22:57:43.408 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 62/103
2026-03-08T22:57:43.410 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 63/103
2026-03-08T22:57:43.412 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 64/103
2026-03-08T22:57:43.415 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 65/103
2026-03-08T22:57:43.421 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 66/103
2026-03-08T22:57:43.425 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 67/103
2026-03-08T22:57:43.430 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 68/103
2026-03-08T22:57:43.434 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 69/103
2026-03-08T22:57:43.440 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 70/103
2026-03-08T22:57:43.444 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 71/103
2026-03-08T22:57:43.447 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 72/103
2026-03-08T22:57:43.452 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : grpc-data-1.46.7-10.el9.noarch 73/103
2026-03-08T22:57:43.455 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-protobuf-3.14.0-17.el9.noarch 74/103
2026-03-08T22:57:43.459 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 75/103
2026-03-08T22:57:43.468 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 76/103
2026-03-08T22:57:43.475 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 77/103
2026-03-08T22:57:43.478 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 78/103
2026-03-08T22:57:43.481 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 79/103
2026-03-08T22:57:43.483 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 80/103
2026-03-08T22:57:43.488 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 81/103
2026-03-08T22:57:43.492 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 82/103
2026-03-08T22:57:43.514 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 83/103
2026-03-08T22:57:43.514 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph.target".
2026-03-08T22:57:43.514 INFO:teuthology.orchestra.run.vm10.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service".
2026-03-08T22:57:43.514 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:43.521 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 83/103
2026-03-08T22:57:43.550 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 83/103
2026-03-08T22:57:43.550 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 84/103
2026-03-08T22:57:43.561 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 84/103
2026-03-08T22:57:43.565 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : qatzip-libs-1.3.1-1.el9.x86_64 85/103
2026-03-08T22:57:43.568 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 86/103
2026-03-08T22:57:43.570 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 87/103
2026-03-08T22:57:43.570 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 88/103
2026-03-08T22:57:48.711 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 88/103
2026-03-08T22:57:48.711 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /sys
2026-03-08T22:57:48.711 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /proc
2026-03-08T22:57:48.711 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /mnt
2026-03-08T22:57:48.711 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /var/tmp
2026-03-08T22:57:48.711 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /home
2026-03-08T22:57:48.711 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /root
2026-03-08T22:57:48.711 INFO:teuthology.orchestra.run.vm10.stdout:skipping the directory /tmp
2026-03-08T22:57:48.711 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:48.719 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : qatlib-25.08.0-2.el9.x86_64 89/103
2026-03-08T22:57:48.735 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 90/103
2026-03-08T22:57:48.735 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : qatlib-service-25.08.0-2.el9.x86_64 90/103
2026-03-08T22:57:48.741 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 90/103
2026-03-08T22:57:48.744 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 91/103
2026-03-08T22:57:48.747 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 92/103
2026-03-08T22:57:48.749 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : pciutils-3.7.0-7.el9.x86_64 93/103
2026-03-08T22:57:48.751 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 94/103
2026-03-08T22:57:48.751 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 95/103
2026-03-08T22:57:48.763 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 95/103
2026-03-08T22:57:48.765 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 96/103
2026-03-08T22:57:48.768 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 97/103
2026-03-08T22:57:48.770 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 98/103
2026-03-08T22:57:48.773 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : protobuf-3.14.0-17.el9.x86_64 99/103
2026-03-08T22:57:48.778 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 100/103
2026-03-08T22:57:48.786 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : cryptsetup-2.8.1-3.el9.x86_64 101/103
2026-03-08T22:57:48.790 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : abseil-cpp-20211102.0-4.el9.x86_64 102/103
2026-03-08T22:57:48.790 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 103/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 103/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : abseil-cpp-20211102.0-4.el9.x86_64 1/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 3/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 4/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 5/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 6/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 7/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 8/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 9/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 10/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 11/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 13/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 14/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 15/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : cryptsetup-2.8.1-3.el9.x86_64 16/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 17/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 18/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 19/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 20/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : grpc-data-1.46.7-10.el9.noarch 21/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 22/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 23/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 24/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 25/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 26/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 27/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 28/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 29/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 30/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 31/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 32/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : pciutils-3.7.0-7.el9.x86_64 33/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : protobuf-3.14.0-17.el9.x86_64 34/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : protobuf-compiler-3.14.0-17.el9.x86_64 35/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 36/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 37/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 38/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 39/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 40/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 41/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 42/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 43/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 44/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 45/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 46/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 47/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 48/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 49/103
2026-03-08T22:57:48.888 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 50/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-grpcio-1.46.7-10.el9.x86_64 51/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-grpcio-tools-1.46.7-10.el9.x86_64 52/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 53/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 54/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 55/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 56/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 57/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 58/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 59/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 60/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 61/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 62/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 63/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 64/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 65/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 66/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 67/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 68/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 69/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 70/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 71/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 72/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-packaging-20.9-5.el9.noarch 73/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 74/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-ply-3.11-14.el9.noarch 75/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 76/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 77/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-protobuf-3.14.0-17.el9.noarch 78/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 79/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 80/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 81/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 82/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pyparsing-2.4.7-9.el9.noarch 83/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 84/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 85/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 86/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 87/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 88/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 89/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 90/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 91/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 92/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 93/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 94/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 95/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 96/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 97/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 98/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 99/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : qatlib-25.08.0-2.el9.x86_64 100/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : qatlib-service-25.08.0-2.el9.x86_64 101/103
2026-03-08T22:57:48.889 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : qatzip-libs-1.3.1-1.el9.x86_64 102/103
2026-03-08T22:57:48.960 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 103/103
2026-03-08T22:57:48.960 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:48.960 INFO:teuthology.orchestra.run.vm10.stdout:Removed:
2026-03-08T22:57:48.960 INFO:teuthology.orchestra.run.vm10.stdout: abseil-cpp-20211102.0-4.el9.x86_64
2026-03-08T22:57:48.960 INFO:teuthology.orchestra.run.vm10.stdout: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:48.960 INFO:teuthology.orchestra.run.vm10.stdout: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:48.960 INFO:teuthology.orchestra.run.vm10.stdout: ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:57:48.960 INFO:teuthology.orchestra.run.vm10.stdout: ceph-immutable-object-cache-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:48.960 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:48.960 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:57:48.960 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: cryptsetup-2.8.1-3.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: grpc-data-1.46.7-10.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: pciutils-3.7.0-7.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: protobuf-3.14.0-17.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: protobuf-compiler-3.14.0-17.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-chardet-4.0.0-5.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-grpcio-1.46.7-10.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-grpcio-tools-1.46.7-10.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-idna-2.10-7.el9.1.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-jsonpatch-1.21-16.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-jsonpointer-2.0-4.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-08T22:57:48.961 INFO:teuthology.orchestra.run.vm10.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-oauthlib-3.1.1-5.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-packaging-20.9-5.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-ply-3.11-14.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-prettytable-0.7.2-27.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-protobuf-3.14.0-17.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-pyparsing-2.4.7-9.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-pysocks-1.7.1-12.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-pytz-2021.1-5.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: qatlib-25.08.0-2.el9.x86_64
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: qatlib-service-25.08.0-2.el9.x86_64
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: qatzip-libs-1.3.1-1.el9.x86_64
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:48.962 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:49.147 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:49.147 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:49.147 INFO:teuthology.orchestra.run.vm10.stdout: Package Arch Version Repository Size
2026-03-08T22:57:49.148 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:49.148 INFO:teuthology.orchestra.run.vm10.stdout:Removing:
2026-03-08T22:57:49.148 INFO:teuthology.orchestra.run.vm10.stdout: cephadm noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 775 k
2026-03-08T22:57:49.148 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:49.148 INFO:teuthology.orchestra.run.vm10.stdout:Transaction Summary
2026-03-08T22:57:49.148 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:49.148 INFO:teuthology.orchestra.run.vm10.stdout:Remove 1 Package
2026-03-08T22:57:49.148 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:49.148 INFO:teuthology.orchestra.run.vm10.stdout:Freed space: 775 k
2026-03-08T22:57:49.148 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction check
2026-03-08T22:57:49.149 INFO:teuthology.orchestra.run.vm10.stdout:Transaction check succeeded.
2026-03-08T22:57:49.149 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction test
2026-03-08T22:57:49.150 INFO:teuthology.orchestra.run.vm10.stdout:Transaction test succeeded.
2026-03-08T22:57:49.150 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction
2026-03-08T22:57:49.165 INFO:teuthology.orchestra.run.vm10.stdout: Preparing : 1/1
2026-03-08T22:57:49.165 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 1/1
2026-03-08T22:57:49.304 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 1/1
2026-03-08T22:57:49.381 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 1/1
2026-03-08T22:57:49.382 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:49.382 INFO:teuthology.orchestra.run.vm10.stdout:Removed:
2026-03-08T22:57:49.382 INFO:teuthology.orchestra.run.vm10.stdout: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:57:49.382 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:49.382 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:49.610 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: ceph-immutable-object-cache
2026-03-08T22:57:49.610 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:49.613 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:49.613 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:49.613 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:49.772 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: ceph-mgr
2026-03-08T22:57:49.773 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:49.776 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:49.776 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:49.776 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:49.950 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: ceph-mgr-dashboard
2026-03-08T22:57:49.950 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:49.953 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:49.954 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:49.954 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:50.120 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-08T22:57:50.120 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:50.123 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:50.123 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:50.123 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:50.280 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: ceph-mgr-rook
2026-03-08T22:57:50.280 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:50.283 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:50.283 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:50.283 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:50.434 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: ceph-mgr-cephadm
2026-03-08T22:57:50.434 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:50.437 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:50.437 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:50.437 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout: Package Arch Version Repository Size
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout:Removing:
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout: ceph-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 3.6 M
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout:Transaction Summary
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout:Remove 1 Package
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout:Freed space: 3.6 M
2026-03-08T22:57:50.598 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction check
2026-03-08T22:57:50.600 INFO:teuthology.orchestra.run.vm10.stdout:Transaction check succeeded.
2026-03-08T22:57:50.600 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction test
2026-03-08T22:57:50.608 INFO:teuthology.orchestra.run.vm10.stdout:Transaction test succeeded.
2026-03-08T22:57:50.608 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction
2026-03-08T22:57:50.631 INFO:teuthology.orchestra.run.vm10.stdout: Preparing : 1/1
2026-03-08T22:57:50.644 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 1/1
2026-03-08T22:57:50.700 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 1/1
2026-03-08T22:57:50.738 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 1/1
2026-03-08T22:57:50.738 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:50.738 INFO:teuthology.orchestra.run.vm10.stdout:Removed:
2026-03-08T22:57:50.738 INFO:teuthology.orchestra.run.vm10.stdout: ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:50.738 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:50.738 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:50.912 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: ceph-volume
2026-03-08T22:57:50.912 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:50.915 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:50.915 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:50.915 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout: Package Arch Version Repo Size
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout:Removing:
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout: librados-devel x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 456 k
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout:Removing dependent packages:
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout: libcephfs-devel x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 153 k
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout:Transaction Summary
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:51.080 INFO:teuthology.orchestra.run.vm10.stdout:Remove 2 Packages
2026-03-08T22:57:51.081 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:51.081 INFO:teuthology.orchestra.run.vm10.stdout:Freed space: 610 k
2026-03-08T22:57:51.081 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction check
2026-03-08T22:57:51.082 INFO:teuthology.orchestra.run.vm10.stdout:Transaction check succeeded.
2026-03-08T22:57:51.082 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction test
2026-03-08T22:57:51.091 INFO:teuthology.orchestra.run.vm10.stdout:Transaction test succeeded.
2026-03-08T22:57:51.091 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction
2026-03-08T22:57:51.114 INFO:teuthology.orchestra.run.vm10.stdout: Preparing : 1/1
2026-03-08T22:57:51.116 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T22:57:51.128 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2/2
2026-03-08T22:57:51.195 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2/2
2026-03-08T22:57:51.195 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T22:57:51.245 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2/2
2026-03-08T22:57:51.245 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:51.245 INFO:teuthology.orchestra.run.vm10.stdout:Removed:
2026-03-08T22:57:51.245 INFO:teuthology.orchestra.run.vm10.stdout: libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:51.245 INFO:teuthology.orchestra.run.vm10.stdout: librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:51.245 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:51.245 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:51.420 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:51.420 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:51.420 INFO:teuthology.orchestra.run.vm10.stdout: Package Arch Version Repo Size
2026-03-08T22:57:51.420 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:51.420 INFO:teuthology.orchestra.run.vm10.stdout:Removing:
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout: libcephfs2 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 3.0 M
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout:Removing dependent packages:
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout: python3-cephfs x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 514 k
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout:Removing unused dependencies:
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout: python3-ceph-argparse x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 187 k
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout:Transaction Summary
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout:Remove 3 Packages
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout:Freed space: 3.7 M
2026-03-08T22:57:51.421 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction check
2026-03-08T22:57:51.422 INFO:teuthology.orchestra.run.vm10.stdout:Transaction check succeeded.
2026-03-08T22:57:51.423 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction test
2026-03-08T22:57:51.438 INFO:teuthology.orchestra.run.vm10.stdout:Transaction test succeeded.
2026-03-08T22:57:51.438 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction
2026-03-08T22:57:51.468 INFO:teuthology.orchestra.run.vm10.stdout: Preparing : 1/1
2026-03-08T22:57:51.470 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 1/3
2026-03-08T22:57:51.471 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86 2/3
2026-03-08T22:57:51.472 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 3/3
2026-03-08T22:57:51.531 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 3/3
2026-03-08T22:57:51.531 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 1/3
2026-03-08T22:57:51.531 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86 2/3
2026-03-08T22:57:51.565 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 3/3
2026-03-08T22:57:51.566 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:51.566 INFO:teuthology.orchestra.run.vm10.stdout:Removed:
2026-03-08T22:57:51.566 INFO:teuthology.orchestra.run.vm10.stdout: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:51.566 INFO:teuthology.orchestra.run.vm10.stdout: python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:51.566 INFO:teuthology.orchestra.run.vm10.stdout: python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:51.566 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:51.566 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:51.723 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: libcephfs-devel
2026-03-08T22:57:51.723 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:51.726 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:51.726 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:51.727 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:51.893 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: Package Arch Version Repository Size
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout:Removing:
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: librados2 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 12 M
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout:Removing dependent packages:
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: python3-rados x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 1.1 M
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: python3-rbd x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 1.1 M
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: python3-rgw x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 265 k
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: rbd-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 227 k
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: rbd-nbd x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 490 k
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout:Removing unused dependencies:
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: libnbd x86_64 1.20.3-4.el9 @appstream 453 k
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: librbd1 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 13 M
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: librgw2 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 19 M
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout:Transaction Summary
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout:================================================================================
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout:Remove 20 Packages
2026-03-08T22:57:51.894 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:51.895 INFO:teuthology.orchestra.run.vm10.stdout:Freed space: 79 M
2026-03-08T22:57:51.895 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction check
2026-03-08T22:57:51.898 INFO:teuthology.orchestra.run.vm10.stdout:Transaction check succeeded.
2026-03-08T22:57:51.898 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction test
2026-03-08T22:57:51.920 INFO:teuthology.orchestra.run.vm10.stdout:Transaction test succeeded.
2026-03-08T22:57:51.920 INFO:teuthology.orchestra.run.vm10.stdout:Running transaction
2026-03-08T22:57:51.982 INFO:teuthology.orchestra.run.vm10.stdout: Preparing : 1/1
2026-03-08T22:57:51.992 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 1/20
2026-03-08T22:57:51.994 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 2/20
2026-03-08T22:57:52.000 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 3/20
2026-03-08T22:57:52.001 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/20
2026-03-08T22:57:52.021 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/20
2026-03-08T22:57:52.026 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/20
2026-03-08T22:57:52.035 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 6/20
2026-03-08T22:57:52.038 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 7/20
2026-03-08T22:57:52.041 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/20
2026-03-08T22:57:52.049 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 9/20
2026-03-08T22:57:52.049 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 10/20
2026-03-08T22:57:52.071 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 10/20
2026-03-08T22:57:52.071 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 11/20
2026-03-08T22:57:52.071 INFO:teuthology.orchestra.run.vm10.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-08T22:57:52.071 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:52.093 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 11/20
2026-03-08T22:57:52.099 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 12/20
2026-03-08T22:57:52.107 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : re2-1:20211101-20.el9.x86_64 13/20
2026-03-08T22:57:52.116 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 14/20
2026-03-08T22:57:52.121 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 15/20
2026-03-08T22:57:52.127 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libnbd-1.20.3-4.el9.x86_64 16/20
2026-03-08T22:57:52.130 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 17/20
2026-03-08T22:57:52.141 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 18/20
2026-03-08T22:57:52.149 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 19/20
2026-03-08T22:57:52.168 INFO:teuthology.orchestra.run.vm10.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 20/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 20/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 2/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 3/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libnbd-1.20.3-4.el9.x86_64 4/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 7/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 8/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 10/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 11/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 12/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 13/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 14/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 15/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 16/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 17/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 18/20
2026-03-08T22:57:52.232 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : re2-1:20211101-20.el9.x86_64 19/20
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 20/20
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout:Removed:
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: libnbd-1.20.3-4.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: re2-1:20211101-20.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout:
2026-03-08T22:57:52.313 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:52.533 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: librbd1
2026-03-08T22:57:52.533 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:52.536 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:52.536 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:52.537 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:52.700 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: python3-rados
2026-03-08T22:57:52.701 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:52.703 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:52.703 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:52.703 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:52.876 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: python3-rgw
2026-03-08T22:57:52.876 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:52.877 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:52.878 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:52.878 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:53.041 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: python3-cephfs
2026-03-08T22:57:53.041 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:53.043 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:53.044 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:53.044 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:53.218 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: python3-rbd
2026-03-08T22:57:53.218 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:53.219 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:53.220 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:53.220 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:53.385 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: rbd-fuse
2026-03-08T22:57:53.385 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:53.387 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:53.388 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:53.388 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:53.547 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: rbd-mirror
2026-03-08T22:57:53.547 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:53.549 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:53.550 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:53.550 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:53.726 INFO:teuthology.orchestra.run.vm10.stdout:No match for argument: rbd-nbd
2026-03-08T22:57:53.726 INFO:teuthology.orchestra.run.vm10.stderr:No packages marked for removal.
2026-03-08T22:57:53.728 INFO:teuthology.orchestra.run.vm10.stdout:Dependencies resolved.
2026-03-08T22:57:53.728 INFO:teuthology.orchestra.run.vm10.stdout:Nothing to do.
2026-03-08T22:57:53.728 INFO:teuthology.orchestra.run.vm10.stdout:Complete!
2026-03-08T22:57:53.750 DEBUG:teuthology.orchestra.run.vm10:> sudo yum clean all
2026-03-08T22:57:53.882 INFO:teuthology.orchestra.run.vm10.stdout:56 files removed
2026-03-08T22:57:53.902 DEBUG:teuthology.orchestra.run.vm10:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-08T22:57:53.930 DEBUG:teuthology.orchestra.run.vm10:> sudo yum clean expire-cache
2026-03-08T22:57:54.086 INFO:teuthology.orchestra.run.vm10.stdout:Cache was expired
2026-03-08T22:57:54.086 INFO:teuthology.orchestra.run.vm10.stdout:0 files removed
2026-03-08T22:57:54.106 DEBUG:teuthology.parallel:result is None
2026-03-08T22:57:54.106 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm10.local
2026-03-08T22:57:54.106 DEBUG:teuthology.orchestra.run.vm10:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-08T22:57:54.134 DEBUG:teuthology.orchestra.run.vm10:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-08T22:57:54.203 DEBUG:teuthology.parallel:result is None
2026-03-08T22:57:54.203 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-08T22:57:54.205 INFO:teuthology.task.clock:Checking final clock skew...
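[Editor's note] The teardown above walks a fixed list of Ceph packages and issues one dnf removal per entry; packages already erased as dependencies of an earlier transaction just produce "No match for argument" and "No packages marked for removal", so the later invocations are harmless no-ops. A minimal Python sketch of that idempotent loop, assuming local execution via subprocess rather than teuthology's own remote-run layer (the package list below is an illustrative subset of what the log removes):

    import subprocess

    # Illustrative subset of the packages removed in the log above.
    PACKAGES = ["ceph-fuse", "ceph-volume", "librados-devel", "libcephfs2",
                "librados2", "python3-rados", "rbd-fuse", "rbd-nbd"]

    def teardown_packages(packages):
        for pkg in packages:
            # 'dnf remove' of an absent package prints "No match for argument"
            # and still exits 0, which is what makes re-running this loop safe.
            subprocess.run(["sudo", "dnf", "-y", "remove", pkg], check=False)
        # Mirror the log's repo cleanup: drop ceph.repo and flush dnf metadata.
        subprocess.run(["sudo", "rm", "-f", "/etc/yum.repos.d/ceph.repo"], check=True)
        subprocess.run(["sudo", "dnf", "clean", "all"], check=True)

    teardown_packages(PACKAGES)

Removing per package instead of in one transaction trades speed for robustness: a single unresolvable name cannot abort the whole cleanup.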
2026-03-08T22:57:54.205 DEBUG:teuthology.orchestra.run.vm10:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T22:57:54.258 INFO:teuthology.orchestra.run.vm10.stderr:bash: line 1: ntpq: command not found
2026-03-08T22:57:54.261 INFO:teuthology.orchestra.run.vm10.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-08T22:57:54.261 INFO:teuthology.orchestra.run.vm10.stdout:===============================================================================
2026-03-08T22:57:54.261 INFO:teuthology.orchestra.run.vm10.stdout:^- stratum2-3.NTP.TechFak.U> 2 6 377 66 +1191us[+1188us] +/- 17ms
2026-03-08T22:57:54.261 INFO:teuthology.orchestra.run.vm10.stdout:^+ kronos.mailus.de 2 7 377 62 +147us[ +147us] +/- 46ms
2026-03-08T22:57:54.261 INFO:teuthology.orchestra.run.vm10.stdout:^* mail.light-speed.de 2 6 377 64 -116us[ -119us] +/- 18ms
2026-03-08T22:57:54.262 INFO:teuthology.orchestra.run.vm10.stdout:^+ pve2.h4x-gamers.top 2 7 377 125 +72us[ +69us] +/- 32ms
2026-03-08T22:57:54.262 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-08T22:57:54.270 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-08T22:57:54.271 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-08T22:57:54.273 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-08T22:57:54.276 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-08T22:57:54.281 INFO:teuthology.task.internal:Duration was 1336.182571 seconds
2026-03-08T22:57:54.281 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-08T22:57:54.284 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-08T22:57:54.284 DEBUG:teuthology.orchestra.run.vm10:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-08T22:57:54.340 INFO:teuthology.orchestra.run.vm10.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-08T22:57:54.593 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-08T22:57:54.593 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm10.local
2026-03-08T22:57:54.593 DEBUG:teuthology.orchestra.run.vm10:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-08T22:57:54.658 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-08T22:57:54.658 DEBUG:teuthology.orchestra.run.vm10:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-08T22:57:55.084 INFO:teuthology.task.internal.syslog:Compressing syslogs...
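[Editor's note] The final clock-skew check above is a simple fallback chain: try ntpq -p, fall back to chronyc sources, and never fail the teardown (the trailing || true). On this CentOS 9 guest ntpq is absent, so chrony's source table is what gets printed. A rough Python equivalent of that chain, assuming whichever tool exists is on PATH:

    import subprocess

    def check_clock_sources():
        # Try ntpd's query tool first, then chrony's, swallowing all
        # failures, like the shell chain 'ntpq -p || chronyc sources || true'.
        for cmd in (["ntpq", "-p"], ["chronyc", "sources"]):
            try:
                result = subprocess.run(cmd, capture_output=True, text=True)
            except FileNotFoundError:
                continue  # e.g. ntpq missing on chrony-only hosts, as here
            if result.returncode == 0:
                print(result.stdout, end="")
                return
        # Nothing worked; mirror '|| true' and treat the check as advisory.

    check_clock_sources()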
2026-03-08T22:57:55.084 DEBUG:teuthology.orchestra.run.vm10:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-08T22:57:55.111 INFO:teuthology.orchestra.run.vm10.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T22:57:55.112 INFO:teuthology.orchestra.run.vm10.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T22:57:55.112 INFO:teuthology.orchestra.run.vm10.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-08T22:57:55.112 INFO:teuthology.orchestra.run.vm10.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-08T22:57:55.113 INFO:teuthology.orchestra.run.vm10.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: /home/ubuntu/cephtest/archive/syslog/journalctl.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-08T22:57:55.257 INFO:teuthology.orchestra.run.vm10.stderr: 98.5% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-08T22:57:55.259 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-08T22:57:55.261 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-08T22:57:55.261 DEBUG:teuthology.orchestra.run.vm10:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-08T22:57:55.321 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-08T22:57:55.323 DEBUG:teuthology.orchestra.run.vm10:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:57:55.389 INFO:teuthology.orchestra.run.vm10.stdout:kernel.core_pattern = core
2026-03-08T22:57:55.403 DEBUG:teuthology.orchestra.run.vm10:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:57:55.458 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:57:55.458 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-08T22:57:55.460 INFO:teuthology.task.internal:Transferring archived files...
2026-03-08T22:57:55.460 DEBUG:teuthology.misc:Transferring archived files from vm10:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/275/remote/vm10
2026-03-08T22:57:55.461 DEBUG:teuthology.orchestra.run.vm10:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-08T22:57:55.530 INFO:teuthology.task.internal:Removing archive directory...
2026-03-08T22:57:55.530 DEBUG:teuthology.orchestra.run.vm10:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-08T22:57:55.585 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-08T22:57:55.588 INFO:teuthology.task.internal:Not uploading archives.
2026-03-08T22:57:55.588 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-08T22:57:55.590 INFO:teuthology.task.internal:Tidying up after the test...
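[Editor's note] The kern.log scan a few entries earlier greps for kernel BUG/INFO/DEADLOCK markers and pipes the hits through a long ignore-list of known-benign patterns, keeping only the first surviving line; an empty result, as in this run, means the kernel log is treated as clean. A condensed Python rendering of that filter (the ignore list here is a small illustrative subset of the patterns in the actual command):

    import re

    MATCH = re.compile(r"\bBUG\b|\bINFO\b|\bDEADLOCK\b")
    IGNORE = [
        re.compile(r"task .* blocked for more than .* seconds"),
        re.compile(r"INFO:ceph-create-keys"),
        re.compile(r"CRON"),
    ]

    def first_kernel_error(path="kern.log"):
        # Return the first matching line that no ignore pattern excuses,
        # mirroring the pipeline's 'head -n 1'; None means a clean log.
        with open(path, errors="replace") as f:
            for line in f:
                if MATCH.search(line) and not any(p.search(line) for p in IGNORE):
                    return line
        return None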
2026-03-08T22:57:55.590 DEBUG:teuthology.orchestra.run.vm10:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-08T22:57:55.641 INFO:teuthology.orchestra.run.vm10.stdout: 8532144 0 drwxr-xr-x 2 ubuntu ubuntu 6 Mar 8 22:57 /home/ubuntu/cephtest
2026-03-08T22:57:55.642 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-08T22:57:55.647 INFO:teuthology.run:Summary data:
description: rados:standalone/{supported-random-distro$/{centos_latest} workloads/crush}
duration: 1336.1825714111328
flavor: default
owner: kyr
success: true
2026-03-08T22:57:55.647 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T22:57:55.664 INFO:teuthology.run:pass
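[Editor's note] The run closes by dumping the summary and pushing the job record to the report server at http://localhost:8080. A rough sketch of such a push; the paddles-style /runs/<name>/jobs/<job_id>/ endpoint shape is an assumption for illustration, not something this log confirms beyond the base URL:

    import json
    import urllib.request

    def push_job_info(base, run_name, job_id, summary):
        # Hypothetical endpoint layout; adjust to the actual report server API.
        url = f"{base}/runs/{run_name}/jobs/{job_id}/"
        req = urllib.request.Request(
            url,
            data=json.dumps(summary).encode(),
            headers={"Content-Type": "application/json"},
            method="PUT",
        )
        with urllib.request.urlopen(req) as resp:
            return resp.status

    summary = {"status": "pass", "success": True,
               "duration": 1336.1825714111328, "owner": "kyr"}
    # push_job_info("http://localhost:8080",
    #               "kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps",
    #               "275", summary)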