2026-03-31T18:58:54.821 INFO:root:teuthology version: 1.2.4.dev37+ga59626679
2026-03-31T18:58:54.825 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-31T18:58:54.846 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4338
branch: tentacle
description: rados/standalone/{supported-random-distro$/{centos_latest} workloads/crush}
email: null
first_in_suite: false
flavor: default
job_id: '4338'
last_in_suite: false
machine_type: vps
name: kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps
no_nested_subset: false
openstack:
- volumes:
    count: 3
    size: 10
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: tentacle
  ansible.cephlab:
    branch: main
    repo: https://github.com/kshtsk/ceph-cm-ansible.git
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      logical_volumes:
        lv_1:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_2:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_3:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_4:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
      timezone: UTC
      volume_groups:
        vg_nvme:
          pvs: /dev/vdb,/dev/vdc,/dev/vdd,/dev/vde
  ceph:
    conf:
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        debug ms: 1
        debug osd: 20
        osd mclock iops capacity threshold hdd: 49000
    flavor: default
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    sha1: 5bb3278730741031382ca9c3dc9d221a942e06a2
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
  cephadm:
    cephadm_binary_url: https://download.ceph.com/rpm-20.2.0/el9/noarch/cephadm
  install:
    ceph:
      flavor: default
      sha1: 5bb3278730741031382ca9c3dc9d221a942e06a2
    extra_system_packages:
      deb:
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
  workunit:
    branch: tt-tentacle
    sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 6407
sha1: 5bb3278730741031382ca9c3dc9d221a942e06a2
sleep_before_teardown: 0
subset: 1/100000
suite: rados
suite_branch: tt-tentacle
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
targets:
  vm05.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGxEjLj/wsS0+MCHXCoIFh7jNxil9D3KziVtFi/QbnRK8eBkVmRgOeITYjf5tUWaTfFw+QKac1HJdSjzsJp+Rsw=
tasks:
- install: null
- workunit:
    basedir: qa/standalone
    clients:
      all:
      - crush
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: uv2
teuthology_repo: https://github.com/kshtsk/teuthology
teuthology_sha1: a59626679648f962bca99d20d35578f2998c8f37
timestamp: 2026-03-31_11:18:10
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.282426
2026-03-31T18:58:54.846 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa; will attempt to use it
2026-03-31T18:58:54.847 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa/tasks
2026-03-31T18:58:54.847 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-31T18:58:54.847 INFO:teuthology.task.internal:Checking packages...
2026-03-31T18:58:54.847 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash '5bb3278730741031382ca9c3dc9d221a942e06a2'
2026-03-31T18:58:54.847 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-31T18:58:54.847 INFO:teuthology.packaging:ref: None
2026-03-31T18:58:54.847 INFO:teuthology.packaging:tag: None
2026-03-31T18:58:54.847 INFO:teuthology.packaging:branch: tentacle
2026-03-31T18:58:54.847 INFO:teuthology.packaging:sha1: 5bb3278730741031382ca9c3dc9d221a942e06a2
2026-03-31T18:58:54.847 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=tentacle
2026-03-31T18:58:55.617 INFO:teuthology.task.internal:Found packages for ceph version 20.2.0-721.g5bb32787
2026-03-31T18:58:55.617 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-31T18:58:55.618 INFO:teuthology.task.internal:no buildpackages task found
2026-03-31T18:58:55.618 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-31T18:58:55.618 INFO:teuthology.task.internal:Saving configuration
2026-03-31T18:58:55.622 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-31T18:58:55.623 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-31T18:58:55.630 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm05.local', 'description': '/archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4338', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-31 18:58:11.830316', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:05', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGxEjLj/wsS0+MCHXCoIFh7jNxil9D3KziVtFi/QbnRK8eBkVmRgOeITYjf5tUWaTfFw+QKac1HJdSjzsJp+Rsw='}
2026-03-31T18:58:55.630 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-31T18:58:55.631 INFO:teuthology.task.internal:roles: ubuntu@vm05.local - ['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-31T18:58:55.631 INFO:teuthology.run_tasks:Running task console_log...
2026-03-31T18:58:55.638 DEBUG:teuthology.task.console_log:vm05 does not support IPMI; excluding
2026-03-31T18:58:55.638 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f314a59e950>, signals=[15])
2026-03-31T18:58:55.638 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-31T18:58:55.639 INFO:teuthology.task.internal:Opening connections...
2026-03-31T18:58:55.639 DEBUG:teuthology.task.internal:connecting to ubuntu@vm05.local
2026-03-31T18:58:55.639 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm05.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-31T18:58:55.701 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-31T18:58:55.702 DEBUG:teuthology.orchestra.run.vm05:> uname -m
2026-03-31T18:58:55.870 INFO:teuthology.orchestra.run.vm05.stdout:x86_64
2026-03-31T18:58:55.870 DEBUG:teuthology.orchestra.run.vm05:> cat /etc/os-release
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:NAME="CentOS Stream"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:VERSION="9"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:ID="centos"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:ID_LIKE="rhel fedora"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:VERSION_ID="9"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:PLATFORM_ID="platform:el9"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:ANSI_COLOR="0;31"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:LOGO="fedora-logo-icon"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:HOME_URL="https://centos.org/"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-31T18:58:55.924 INFO:teuthology.orchestra.run.vm05.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-31T18:58:55.925 INFO:teuthology.lock.ops:Updating vm05.local on lock server
2026-03-31T18:58:55.929 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-31T18:58:55.931 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-31T18:58:55.931 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-31T18:58:55.932 DEBUG:teuthology.orchestra.run.vm05:> test '!' -e /home/ubuntu/cephtest
2026-03-31T18:58:55.978 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-31T18:58:55.979 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-31T18:58:55.979 DEBUG:teuthology.orchestra.run.vm05:> test -z $(ls -A /var/lib/ceph)
2026-03-31T18:58:56.032 INFO:teuthology.orchestra.run.vm05.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-31T18:58:56.032 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-31T18:58:56.040 DEBUG:teuthology.orchestra.run.vm05:> test -e /ceph-qa-ready
2026-03-31T18:58:56.085 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-31T18:58:56.282 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-31T18:58:56.284 INFO:teuthology.task.internal:Creating test directory...
2026-03-31T18:58:56.284 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-31T18:58:56.301 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-31T18:58:56.303 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-31T18:58:56.304 INFO:teuthology.task.internal:Creating archive directory...
2026-03-31T18:58:56.304 DEBUG:teuthology.orchestra.run.vm05:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-31T18:58:56.361 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-31T18:58:56.362 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-31T18:58:56.362 DEBUG:teuthology.orchestra.run.vm05:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-31T18:58:56.414 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-31T18:58:56.414 DEBUG:teuthology.orchestra.run.vm05:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-31T18:58:56.480 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T18:58:56.490 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T18:58:56.491 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-31T18:58:56.514 INFO:teuthology.task.internal:Configuring sudo...
2026-03-31T18:58:56.514 DEBUG:teuthology.orchestra.run.vm05:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-31T18:58:56.555 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-31T18:58:56.557 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-31T18:58:56.558 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-31T18:58:56.611 DEBUG:teuthology.orchestra.run.vm05:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-31T18:58:56.676 DEBUG:teuthology.orchestra.run.vm05:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-31T18:58:56.734 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-31T18:58:56.734 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-31T18:58:56.794 DEBUG:teuthology.orchestra.run.vm05:> sudo service rsyslog restart
2026-03-31T18:58:56.864 INFO:teuthology.orchestra.run.vm05.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-31T18:58:57.189 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-31T18:58:57.191 INFO:teuthology.task.internal:Starting timer...
2026-03-31T18:58:57.191 INFO:teuthology.run_tasks:Running task pcp...
2026-03-31T18:58:57.193 INFO:teuthology.run_tasks:Running task selinux...
2026-03-31T18:58:57.195 INFO:teuthology.task.selinux:Excluding vm05: VMs are not yet supported
2026-03-31T18:58:57.196 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-31T18:58:57.196 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-31T18:58:57.196 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-31T18:58:57.196 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-31T18:58:57.197 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'repo': 'https://github.com/kshtsk/ceph-cm-ansible.git', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'logical_volumes': {'lv_1': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_2': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_3': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_4': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}}, 'timezone': 'UTC', 'volume_groups': {'vg_nvme': {'pvs': '/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde'}}}}
2026-03-31T18:58:57.197 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/kshtsk/ceph-cm-ansible.git
2026-03-31T18:58:57.199 INFO:teuthology.repo_utils:Fetching github.com_kshtsk_ceph-cm-ansible_main from origin
2026-03-31T18:58:57.837 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main to origin/main
2026-03-31T18:58:57.842 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-31T18:58:57.843 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "logical_volumes": {"lv_1": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_2": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_3": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_4": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}}, "timezone": "UTC", "volume_groups": {"vg_nvme": {"pvs": "/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde"}}}' -i /tmp/teuth_ansible_inventoryqypb7_e_ --limit vm05.local /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-31T19:00:27.429 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm05.local')]
2026-03-31T19:00:27.430 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm05.local'
2026-03-31T19:00:27.430 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm05.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-31T19:00:27.491 DEBUG:teuthology.orchestra.run.vm05:> true
2026-03-31T19:00:27.570 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm05.local'
2026-03-31T19:00:27.571 INFO:teuthology.run_tasks:Running task clock...
2026-03-31T19:00:27.573 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-31T19:00:27.573 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-31T19:00:27.574 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-31T19:00:27.641 INFO:teuthology.orchestra.run.vm05.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-31T19:00:27.657 INFO:teuthology.orchestra.run.vm05.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-31T19:00:27.682 INFO:teuthology.orchestra.run.vm05.stderr:sudo: ntpd: command not found
2026-03-31T19:00:27.693 INFO:teuthology.orchestra.run.vm05.stdout:506 Cannot talk to daemon
2026-03-31T19:00:27.711 INFO:teuthology.orchestra.run.vm05.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-31T19:00:27.726 INFO:teuthology.orchestra.run.vm05.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-31T19:00:27.773 INFO:teuthology.orchestra.run.vm05.stderr:bash: line 1: ntpq: command not found
2026-03-31T19:00:27.778 INFO:teuthology.orchestra.run.vm05.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-31T19:00:27.778 INFO:teuthology.orchestra.run.vm05.stdout:===============================================================================
2026-03-31T19:00:27.778 INFO:teuthology.orchestra.run.vm05.stdout:^? nbg01.muxx.net 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-31T19:00:27.778 INFO:teuthology.orchestra.run.vm05.stdout:^? ntp2.uni-ulm.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-31T19:00:27.778 INFO:teuthology.orchestra.run.vm05.stdout:^? ovh.saclay.org 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-31T19:00:27.778 INFO:teuthology.orchestra.run.vm05.stdout:^? ntp.kernfusion.at 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-31T19:00:27.778 INFO:teuthology.run_tasks:Running task install...
2026-03-31T19:00:27.780 DEBUG:teuthology.task.install:project ceph
2026-03-31T19:00:27.780 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': '5bb3278730741031382ca9c3dc9d221a942e06a2'}, 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-31T19:00:27.780 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': '5bb3278730741031382ca9c3dc9d221a942e06a2', 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-31T19:00:27.780 INFO:teuthology.task.install:Using flavor: default
2026-03-31T19:00:27.783 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-31T19:00:27.783 INFO:teuthology.task.install:extra packages: []
2026-03-31T19:00:27.783 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': [], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': '5bb3278730741031382ca9c3dc9d221a942e06a2', 'tag': None, 'wait_for_package': False}
2026-03-31T19:00:27.783 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5bb3278730741031382ca9c3dc9d221a942e06a2
2026-03-31T19:00:28.410 INFO:teuthology.task.install.rpm:Pulling from https://2.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/centos/9/flavors/default/
2026-03-31T19:00:28.410 INFO:teuthology.task.install.rpm:Package version is 20.2.0-721.g5bb32787
2026-03-31T19:00:28.928 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://2.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-noarch]
name=ceph noarch packages
baseurl=https://2.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-source]
name=ceph source packages
baseurl=https://2.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
2026-03-31T19:00:28.928 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-31T19:00:28.928 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/yum.repos.d/ceph.repo
2026-03-31T19:00:28.954 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, ceph-volume, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-jmespath, python3-xmltodict, s3cmd on remote rpm x86_64
2026-03-31T19:00:28.954 DEBUG:teuthology.orchestra.run.vm05:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-31T19:00:29.020 DEBUG:teuthology.orchestra.run.vm05:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-31T19:00:29.099 DEBUG:teuthology.orchestra.run.vm05:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-31T19:00:29.164 INFO:teuthology.orchestra.run.vm05.stdout:check_obsoletes = 1
2026-03-31T19:00:29.165 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean all
2026-03-31T19:00:29.344 INFO:teuthology.orchestra.run.vm05.stdout:41 files removed
2026-03-31T19:00:29.363 DEBUG:teuthology.orchestra.run.vm05:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-jmespath python3-xmltodict s3cmd
2026-03-31T19:00:30.691 INFO:teuthology.orchestra.run.vm05.stdout:ceph packages for x86_64 77 kB/s | 89 kB 00:01
2026-03-31T19:00:31.770 INFO:teuthology.orchestra.run.vm05.stdout:ceph noarch packages 18 kB/s | 19 kB 00:01
2026-03-31T19:00:32.710 INFO:teuthology.orchestra.run.vm05.stdout:ceph source packages 2.1 kB/s | 1.9 kB 00:00
2026-03-31T19:00:34.889 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - BaseOS 4.1 MB/s | 8.9 MB 00:02
2026-03-31T19:00:36.286 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - AppStream 35 MB/s | 27 MB 00:00
2026-03-31T19:00:39.322 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - CRB 19 MB/s | 8.0 MB 00:00
2026-03-31T19:00:40.900 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - Extras packages 27 kB/s | 21 kB 00:00
2026-03-31T19:00:41.370 INFO:teuthology.orchestra.run.vm05.stdout:Extra Packages for Enterprise Linux 50 MB/s | 20 MB 00:00
2026-03-31T19:00:46.315 INFO:teuthology.orchestra.run.vm05.stdout:lab-extras 56 kB/s | 50 kB 00:00
2026-03-31T19:00:47.567 INFO:teuthology.orchestra.run.vm05.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-31T19:00:47.567 INFO:teuthology.orchestra.run.vm05.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-31T19:00:47.603 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout:======================================================================================
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout:======================================================================================
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout:Installing:
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: bzip2 x86_64 1.0.8-11.el9 baseos 55 k
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph x86_64 2:20.2.0-721.g5bb32787.el9 ceph 6.5 k
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base x86_64 2:20.2.0-721.g5bb32787.el9 ceph 5.9 M
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse x86_64 2:20.2.0-721.g5bb32787.el9 ceph 940 k
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache x86_64 2:20.2.0-721.g5bb32787.el9 ceph 154 k
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr x86_64 2:20.2.0-721.g5bb32787.el9 ceph 961 k
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm noarch 2:20.2.0-721.g5bb32787.el9 ceph-noarch 173 k
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard noarch 2:20.2.0-721.g5bb32787.el9 ceph-noarch 11 M
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local noarch 2:20.2.0-721.g5bb32787.el9 ceph-noarch 7.4 M
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook noarch 2:20.2.0-721.g5bb32787.el9 ceph-noarch 50 k
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw x86_64 2:20.2.0-721.g5bb32787.el9 ceph 24 M
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test x86_64 2:20.2.0-721.g5bb32787.el9 ceph 84 M
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: ceph-volume noarch 2:20.2.0-721.g5bb32787.el9 ceph-noarch 298 k
2026-03-31T19:00:47.608 INFO:teuthology.orchestra.run.vm05.stdout: cephadm noarch 2:20.2.0-721.g5bb32787.el9 ceph-noarch 1.0 M
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel x86_64 2:20.2.0-721.g5bb32787.el9 ceph 34 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2 x86_64 2:20.2.0-721.g5bb32787.el9 ceph 867 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel x86_64 2:20.2.0-721.g5bb32787.el9 ceph 126 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: perl-Test-Harness noarch 1:3.42-461.el9 appstream 295 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs x86_64 2:20.2.0-721.g5bb32787.el9 ceph 163 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados x86_64 2:20.2.0-721.g5bb32787.el9 ceph 323 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd x86_64 2:20.2.0-721.g5bb32787.el9 ceph 304 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw x86_64 2:20.2.0-721.g5bb32787.el9 ceph 99 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse x86_64 2:20.2.0-721.g5bb32787.el9 ceph 91 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror x86_64 2:20.2.0-721.g5bb32787.el9 ceph 2.9 M
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd x86_64 2:20.2.0-721.g5bb32787.el9 ceph 179 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: s3cmd noarch 2.4.0-1.el9 epel 206 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout:Upgrading:
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: librados2 x86_64 2:20.2.0-721.g5bb32787.el9 ceph 3.5 M
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: librbd1 x86_64 2:20.2.0-721.g5bb32787.el9 ceph 2.8 M
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout:Installing dependencies:
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: abseil-cpp x86_64 20211102.0-4.el9 epel 551 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common x86_64 2:20.2.0-721.g5bb32787.el9 ceph 24 M
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards noarch 2:20.2.0-721.g5bb32787.el9 ceph-noarch 43 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds x86_64 2:20.2.0-721.g5bb32787.el9 ceph 2.3 M
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core noarch 2:20.2.0-721.g5bb32787.el9 ceph-noarch 290 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon x86_64 2:20.2.0-721.g5bb32787.el9 ceph 5.0 M
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd x86_64 2:20.2.0-721.g5bb32787.el9 ceph 17 M
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts noarch 2:20.2.0-721.g5bb32787.el9 ceph-noarch 17 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux x86_64 2:20.2.0-721.g5bb32787.el9 ceph 25 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: cryptsetup x86_64 2.8.1-3.el9 baseos 351 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: fuse x86_64 2.9.9-17.el9 baseos 80 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: grpc-data noarch 1.46.7-10.el9 epel 19 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-proxy2 x86_64 2:20.2.0-721.g5bb32787.el9 ceph 24 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite x86_64 2:20.2.0-721.g5bb32787.el9 ceph 164 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libnbd x86_64 1.20.3-4.el9 appstream 164 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1 x86_64 2:20.2.0-721.g5bb32787.el9 ceph 250 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: librgw2 x86_64 2:20.2.0-721.g5bb32787.el9 ceph 6.4 M
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: lua x86_64 5.4.4-4.el9 appstream 188 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: lua-devel x86_64 5.4.4-4.el9 crb 22 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: luarocks noarch 3.9.2-5.el9 epel 151 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-31T19:00:47.609 INFO:teuthology.orchestra.run.vm05.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: pciutils x86_64 3.7.0-7.el9 baseos 93 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: perl-Benchmark noarch 1.23-483.el9 appstream 26 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: protobuf x86_64 3.14.0-17.el9 appstream 1.0 M
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: protobuf-compiler x86_64 3.14.0-17.el9 crb 862 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse x86_64 2:20.2.0-721.g5bb32787.el9 ceph 45 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common x86_64 2:20.2.0-721.g5bb32787.el9 ceph 175 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot noarch 10.0.1-5.el9 epel 173 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy noarch 18.10.0-5.el9 epel 290 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-grpcio x86_64 1.46.7-10.el9 epel 2.0 M
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-grpcio-tools x86_64 1.46.7-10.el9 epel 144 k
2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-isodate noarch 0.6.1-3.el9
epel 56 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-lxml x86_64 4.6.5-3.el9 appstream 1.2 M 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-msgpack x86_64 1.0.3-2.el9 epel 86 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-packaging noarch 
20.9-5.el9 appstream 77 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-protobuf noarch 3.14.0-17.el9 appstream 267 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyparsing noarch 2.4.7-9.el9 baseos 150 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 
86 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-31T19:00:47.610 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmlsec x86_64 1.3.13-1.el9 epel 48 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: qatlib x86_64 25.08.0-2.el9 appstream 240 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: qatzip-libs x86_64 1.3.1-1.el9 appstream 66 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: unzip x86_64 6.0-59.el9 baseos 182 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: xmlsec1 x86_64 1.2.29-13.el9 appstream 189 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: xmlsec1-openssl x86_64 1.2.29-13.el9 appstream 90 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: zip x86_64 3.0-35.el9 baseos 266 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout:Installing weak dependencies: 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-k8sevents noarch 2:20.2.0-721.g5bb32787.el9 ceph-noarch 22 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-daemon x86_64 2:20.2.0-721.g5bb32787.el9 ceph 35 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: 
nvme-cli x86_64 2.16-1.el9 baseos 1.2 M 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: python3-influxdb noarch 5.3.1-1.el9 epel 139 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: python3-saml noarch 1.16.0-1.el9 epel 125 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: qatlib-service x86_64 25.08.0-2.el9 appstream 37 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: smartmontools x86_64 1:7.2-10.el9 baseos 556 k 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout:====================================================================================== 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout:Install 148 Packages 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout:Upgrade 2 Packages 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout:Total download size: 270 M 2026-03-31T19:00:47.611 INFO:teuthology.orchestra.run.vm05.stdout:Downloading Packages: 2026-03-31T19:00:49.408 INFO:teuthology.orchestra.run.vm05.stdout:(1/150): ceph-20.2.0-721.g5bb32787.el9.x86_64.r 14 kB/s | 6.5 kB 00:00 2026-03-31T19:00:50.255 INFO:teuthology.orchestra.run.vm05.stdout:(2/150): ceph-fuse-20.2.0-721.g5bb32787.el9.x86 1.1 MB/s | 940 kB 00:00 2026-03-31T19:00:50.374 INFO:teuthology.orchestra.run.vm05.stdout:(3/150): ceph-immutable-object-cache-20.2.0-721 1.3 MB/s | 154 kB 00:00 2026-03-31T19:00:50.469 INFO:teuthology.orchestra.run.vm05.stdout:(4/150): ceph-base-20.2.0-721.g5bb32787.el9.x86 3.8 MB/s | 5.9 MB 00:01 2026-03-31T19:00:50.596 INFO:teuthology.orchestra.run.vm05.stdout:(5/150): ceph-mgr-20.2.0-721.g5bb32787.el9.x86_ 7.4 MB/s | 961 kB 00:00 2026-03-31T19:00:50.630 INFO:teuthology.orchestra.run.vm05.stdout:(6/150): 
ceph-mds-20.2.0-721.g5bb32787.el9.x86_ 9.2 MB/s | 2.3 MB 00:00 2026-03-31T19:00:50.969 INFO:teuthology.orchestra.run.vm05.stdout:(7/150): ceph-mon-20.2.0-721.g5bb32787.el9.x86_ 14 MB/s | 5.0 MB 00:00 2026-03-31T19:00:52.233 INFO:teuthology.orchestra.run.vm05.stdout:(8/150): ceph-common-20.2.0-721.g5bb32787.el9.x 7.2 MB/s | 24 MB 00:03 2026-03-31T19:00:52.349 INFO:teuthology.orchestra.run.vm05.stdout:(9/150): ceph-selinux-20.2.0-721.g5bb32787.el9. 216 kB/s | 25 kB 00:00 2026-03-31T19:00:52.987 INFO:teuthology.orchestra.run.vm05.stdout:(10/150): ceph-osd-20.2.0-721.g5bb32787.el9.x86 7.2 MB/s | 17 MB 00:02 2026-03-31T19:00:53.106 INFO:teuthology.orchestra.run.vm05.stdout:(11/150): libcephfs-daemon-20.2.0-721.g5bb32787 299 kB/s | 35 kB 00:00 2026-03-31T19:00:53.224 INFO:teuthology.orchestra.run.vm05.stdout:(12/150): libcephfs-devel-20.2.0-721.g5bb32787. 291 kB/s | 34 kB 00:00 2026-03-31T19:00:53.342 INFO:teuthology.orchestra.run.vm05.stdout:(13/150): libcephfs-proxy2-20.2.0-721.g5bb32787 205 kB/s | 24 kB 00:00 2026-03-31T19:00:54.649 INFO:teuthology.orchestra.run.vm05.stdout:(14/150): ceph-radosgw-20.2.0-721.g5bb32787.el9 6.4 MB/s | 24 MB 00:03 2026-03-31T19:00:54.770 INFO:teuthology.orchestra.run.vm05.stdout:(15/150): libcephsqlite-20.2.0-721.g5bb32787.el 1.3 MB/s | 164 kB 00:00 2026-03-31T19:00:54.889 INFO:teuthology.orchestra.run.vm05.stdout:(16/150): librados-devel-20.2.0-721.g5bb32787.e 1.0 MB/s | 126 kB 00:00 2026-03-31T19:00:54.905 INFO:teuthology.orchestra.run.vm05.stdout:(17/150): libcephfs2-20.2.0-721.g5bb32787.el9.x 555 kB/s | 867 kB 00:01 2026-03-31T19:00:55.009 INFO:teuthology.orchestra.run.vm05.stdout:(18/150): libradosstriper1-20.2.0-721.g5bb32787 2.0 MB/s | 250 kB 00:00 2026-03-31T19:00:55.125 INFO:teuthology.orchestra.run.vm05.stdout:(19/150): python3-ceph-argparse-20.2.0-721.g5bb 388 kB/s | 45 kB 00:00 2026-03-31T19:00:55.244 INFO:teuthology.orchestra.run.vm05.stdout:(20/150): python3-ceph-common-20.2.0-721.g5bb32 1.4 MB/s | 175 kB 00:00 
2026-03-31T19:00:55.363 INFO:teuthology.orchestra.run.vm05.stdout:(21/150): python3-cephfs-20.2.0-721.g5bb32787.e 1.3 MB/s | 163 kB 00:00 2026-03-31T19:00:55.483 INFO:teuthology.orchestra.run.vm05.stdout:(22/150): python3-rados-20.2.0-721.g5bb32787.el 2.6 MB/s | 323 kB 00:00 2026-03-31T19:00:55.603 INFO:teuthology.orchestra.run.vm05.stdout:(23/150): python3-rbd-20.2.0-721.g5bb32787.el9. 2.5 MB/s | 304 kB 00:00 2026-03-31T19:00:55.721 INFO:teuthology.orchestra.run.vm05.stdout:(24/150): python3-rgw-20.2.0-721.g5bb32787.el9. 846 kB/s | 99 kB 00:00 2026-03-31T19:00:55.838 INFO:teuthology.orchestra.run.vm05.stdout:(25/150): rbd-fuse-20.2.0-721.g5bb32787.el9.x86 774 kB/s | 91 kB 00:00 2026-03-31T19:00:55.979 INFO:teuthology.orchestra.run.vm05.stdout:(26/150): librgw2-20.2.0-721.g5bb32787.el9.x86_ 5.9 MB/s | 6.4 MB 00:01 2026-03-31T19:00:56.099 INFO:teuthology.orchestra.run.vm05.stdout:(27/150): rbd-nbd-20.2.0-721.g5bb32787.el9.x86_ 1.5 MB/s | 179 kB 00:00 2026-03-31T19:00:56.218 INFO:teuthology.orchestra.run.vm05.stdout:(28/150): ceph-grafana-dashboards-20.2.0-721.g5 365 kB/s | 43 kB 00:00 2026-03-31T19:00:56.354 INFO:teuthology.orchestra.run.vm05.stdout:(29/150): rbd-mirror-20.2.0-721.g5bb32787.el9.x 5.7 MB/s | 2.9 MB 00:00 2026-03-31T19:00:56.364 INFO:teuthology.orchestra.run.vm05.stdout:(30/150): ceph-mgr-cephadm-20.2.0-721.g5bb32787 1.2 MB/s | 173 kB 00:00 2026-03-31T19:00:57.551 INFO:teuthology.orchestra.run.vm05.stdout:(31/150): ceph-mgr-diskprediction-local-20.2.0- 6.2 MB/s | 7.4 MB 00:01 2026-03-31T19:00:57.669 INFO:teuthology.orchestra.run.vm05.stdout:(32/150): ceph-mgr-k8sevents-20.2.0-721.g5bb327 187 kB/s | 22 kB 00:00 2026-03-31T19:00:57.790 INFO:teuthology.orchestra.run.vm05.stdout:(33/150): ceph-mgr-modules-core-20.2.0-721.g5bb 2.3 MB/s | 290 kB 00:00 2026-03-31T19:00:57.876 INFO:teuthology.orchestra.run.vm05.stdout:(34/150): ceph-mgr-dashboard-20.2.0-721.g5bb327 7.0 MB/s | 11 MB 00:01 2026-03-31T19:00:57.909 
INFO:teuthology.orchestra.run.vm05.stdout:(35/150): ceph-mgr-rook-20.2.0-721.g5bb32787.el 422 kB/s | 50 kB 00:00 2026-03-31T19:00:57.996 INFO:teuthology.orchestra.run.vm05.stdout:(36/150): ceph-prometheus-alerts-20.2.0-721.g5b 146 kB/s | 17 kB 00:00 2026-03-31T19:00:58.030 INFO:teuthology.orchestra.run.vm05.stdout:(37/150): ceph-volume-20.2.0-721.g5bb32787.el9. 2.4 MB/s | 298 kB 00:00 2026-03-31T19:00:58.194 INFO:teuthology.orchestra.run.vm05.stdout:(38/150): bzip2-1.0.8-11.el9.x86_64.rpm 335 kB/s | 55 kB 00:00 2026-03-31T19:00:58.229 INFO:teuthology.orchestra.run.vm05.stdout:(39/150): cephadm-20.2.0-721.g5bb32787.el9.noar 4.3 MB/s | 1.0 MB 00:00 2026-03-31T19:00:58.401 INFO:teuthology.orchestra.run.vm05.stdout:(40/150): fuse-2.9.9-17.el9.x86_64.rpm 464 kB/s | 80 kB 00:00 2026-03-31T19:00:58.445 INFO:teuthology.orchestra.run.vm05.stdout:(41/150): cryptsetup-2.8.1-3.el9.x86_64.rpm 1.4 MB/s | 351 kB 00:00 2026-03-31T19:00:58.452 INFO:teuthology.orchestra.run.vm05.stdout:(42/150): ledmon-libs-1.1.0-3.el9.x86_64.rpm 796 kB/s | 40 kB 00:00 2026-03-31T19:00:58.501 INFO:teuthology.orchestra.run.vm05.stdout:(43/150): libconfig-1.7.2-9.el9.x86_64.rpm 1.3 MB/s | 72 kB 00:00 2026-03-31T19:00:58.607 INFO:teuthology.orchestra.run.vm05.stdout:(44/150): libquadmath-11.5.0-14.el9.x86_64.rpm 1.7 MB/s | 184 kB 00:00 2026-03-31T19:00:58.642 INFO:teuthology.orchestra.run.vm05.stdout:(45/150): mailcap-2.1.49-5.el9.noarch.rpm 962 kB/s | 33 kB 00:00 2026-03-31T19:00:58.693 INFO:teuthology.orchestra.run.vm05.stdout:(46/150): libgfortran-11.5.0-14.el9.x86_64.rpm 3.2 MB/s | 794 kB 00:00 2026-03-31T19:00:58.767 INFO:teuthology.orchestra.run.vm05.stdout:(47/150): pciutils-3.7.0-7.el9.x86_64.rpm 1.2 MB/s | 93 kB 00:00 2026-03-31T19:00:58.845 INFO:teuthology.orchestra.run.vm05.stdout:(48/150): python3-cffi-1.14.5-5.el9.x86_64.rpm 3.2 MB/s | 253 kB 00:00 2026-03-31T19:00:58.852 INFO:teuthology.orchestra.run.vm05.stdout:(49/150): nvme-cli-2.16-1.el9.x86_64.rpm 5.5 MB/s | 1.2 MB 00:00 
2026-03-31T19:00:58.915 INFO:teuthology.orchestra.run.vm05.stdout:(50/150): python3-ply-3.11-14.el9.noarch.rpm 1.6 MB/s | 106 kB 00:00 2026-03-31T19:00:58.986 INFO:teuthology.orchestra.run.vm05.stdout:(51/150): python3-pycparser-2.20-6.el9.noarch.r 1.9 MB/s | 135 kB 00:00 2026-03-31T19:00:59.030 INFO:teuthology.orchestra.run.vm05.stdout:(52/150): python3-cryptography-36.0.1-5.el9.x86 6.7 MB/s | 1.2 MB 00:00 2026-03-31T19:00:59.046 INFO:teuthology.orchestra.run.vm05.stdout:(53/150): python3-pyparsing-2.4.7-9.el9.noarch. 2.5 MB/s | 150 kB 00:00 2026-03-31T19:00:59.103 INFO:teuthology.orchestra.run.vm05.stdout:(54/150): python3-requests-2.25.1-10.el9.noarch 1.7 MB/s | 126 kB 00:00 2026-03-31T19:00:59.114 INFO:teuthology.orchestra.run.vm05.stdout:(55/150): python3-urllib3-1.26.5-7.el9.noarch.r 3.1 MB/s | 218 kB 00:00 2026-03-31T19:00:59.188 INFO:teuthology.orchestra.run.vm05.stdout:(56/150): unzip-6.0-59.el9.x86_64.rpm 2.4 MB/s | 182 kB 00:00 2026-03-31T19:00:59.194 INFO:teuthology.orchestra.run.vm05.stdout:(57/150): smartmontools-7.2-10.el9.x86_64.rpm 6.0 MB/s | 556 kB 00:00 2026-03-31T19:00:59.249 INFO:teuthology.orchestra.run.vm05.stdout:(58/150): zip-3.0-35.el9.x86_64.rpm 4.3 MB/s | 266 kB 00:00 2026-03-31T19:00:59.341 INFO:teuthology.orchestra.run.vm05.stdout:(59/150): boost-program-options-1.75.0-13.el9.x 706 kB/s | 104 kB 00:00 2026-03-31T19:00:59.443 INFO:teuthology.orchestra.run.vm05.stdout:(60/150): flexiblas-3.0.4-9.el9.x86_64.rpm 152 kB/s | 30 kB 00:00 2026-03-31T19:00:59.518 INFO:teuthology.orchestra.run.vm05.stdout:(61/150): flexiblas-openblas-openmp-3.0.4-9.el9 198 kB/s | 15 kB 00:00 2026-03-31T19:00:59.589 INFO:teuthology.orchestra.run.vm05.stdout:(62/150): flexiblas-netlib-3.0.4-9.el9.x86_64.r 12 MB/s | 3.0 MB 00:00 2026-03-31T19:00:59.629 INFO:teuthology.orchestra.run.vm05.stdout:(63/150): libnbd-1.20.3-4.el9.x86_64.rpm 1.4 MB/s | 164 kB 00:00 2026-03-31T19:00:59.633 INFO:teuthology.orchestra.run.vm05.stdout:(64/150): 
libpmemobj-1.12.1-1.el9.x86_64.rpm 3.6 MB/s | 160 kB 00:00 2026-03-31T19:00:59.660 INFO:teuthology.orchestra.run.vm05.stdout:(65/150): librabbitmq-0.11.0-7.el9.x86_64.rpm 1.4 MB/s | 45 kB 00:00 2026-03-31T19:00:59.671 INFO:teuthology.orchestra.run.vm05.stdout:(66/150): librdkafka-1.6.1-102.el9.x86_64.rpm 17 MB/s | 662 kB 00:00 2026-03-31T19:00:59.741 INFO:teuthology.orchestra.run.vm05.stdout:(67/150): libxslt-1.1.34-12.el9.x86_64.rpm 3.3 MB/s | 233 kB 00:00 2026-03-31T19:00:59.766 INFO:teuthology.orchestra.run.vm05.stdout:(68/150): libstoragemgmt-1.10.1-1.el9.x86_64.rp 2.3 MB/s | 246 kB 00:00 2026-03-31T19:00:59.845 INFO:teuthology.orchestra.run.vm05.stdout:(69/150): lttng-ust-2.12.0-6.el9.x86_64.rpm 2.8 MB/s | 292 kB 00:00 2026-03-31T19:01:00.509 INFO:teuthology.orchestra.run.vm05.stdout:(70/150): lua-5.4.4-4.el9.x86_64.rpm 254 kB/s | 188 kB 00:00 2026-03-31T19:01:00.589 INFO:teuthology.orchestra.run.vm05.stdout:(71/150): openblas-0.3.29-1.el9.x86_64.rpm 57 kB/s | 42 kB 00:00 2026-03-31T19:01:00.949 INFO:teuthology.orchestra.run.vm05.stdout:(72/150): perl-Benchmark-1.23-483.el9.noarch.rp 73 kB/s | 26 kB 00:00 2026-03-31T19:01:00.984 INFO:teuthology.orchestra.run.vm05.stdout:(73/150): perl-Test-Harness-3.42-461.el9.noarch 8.4 MB/s | 295 kB 00:00 2026-03-31T19:01:01.050 INFO:teuthology.orchestra.run.vm05.stdout:(74/150): protobuf-3.14.0-17.el9.x86_64.rpm 15 MB/s | 1.0 MB 00:00 2026-03-31T19:01:01.085 INFO:teuthology.orchestra.run.vm05.stdout:(75/150): openblas-openmp-0.3.29-1.el9.x86_64.r 9.2 MB/s | 5.3 MB 00:00 2026-03-31T19:01:01.485 INFO:teuthology.orchestra.run.vm05.stdout:(76/150): python3-devel-3.9.25-3.el9.x86_64.rpm 611 kB/s | 244 kB 00:00 2026-03-31T19:01:01.633 INFO:teuthology.orchestra.run.vm05.stdout:(77/150): python3-jinja2-2.11.3-8.el9.noarch.rp 1.6 MB/s | 249 kB 00:00 2026-03-31T19:01:01.663 INFO:teuthology.orchestra.run.vm05.stdout:(78/150): python3-babel-2.9.1-2.el9.noarch.rpm 9.7 MB/s | 6.0 MB 00:00 2026-03-31T19:01:01.682 
INFO:teuthology.orchestra.run.vm05.stdout:(79/150): python3-jmespath-1.0.1-1.el9.noarch.r 977 kB/s | 48 kB 00:00 2026-03-31T19:01:01.697 INFO:teuthology.orchestra.run.vm05.stdout:(80/150): python3-libstoragemgmt-1.10.1-1.el9.x 5.2 MB/s | 177 kB 00:00 2026-03-31T19:01:01.941 INFO:teuthology.orchestra.run.vm05.stdout:(81/150): python3-markupsafe-1.1.1-12.el9.x86_6 142 kB/s | 35 kB 00:00 2026-03-31T19:01:01.950 INFO:teuthology.orchestra.run.vm05.stdout:(82/150): python3-lxml-4.6.5-3.el9.x86_64.rpm 4.6 MB/s | 1.2 MB 00:00 2026-03-31T19:01:02.162 INFO:teuthology.orchestra.run.vm05.stdout:(83/150): python3-numpy-f2py-1.23.5-2.el9.x86_6 2.0 MB/s | 442 kB 00:00 2026-03-31T19:01:02.213 INFO:teuthology.orchestra.run.vm05.stdout:(84/150): python3-packaging-20.9-5.el9.noarch.r 1.5 MB/s | 77 kB 00:00 2026-03-31T19:01:02.249 INFO:teuthology.orchestra.run.vm05.stdout:(85/150): python3-protobuf-3.14.0-17.el9.noarch 7.3 MB/s | 267 kB 00:00 2026-03-31T19:01:02.333 INFO:teuthology.orchestra.run.vm05.stdout:(86/150): python3-numpy-1.23.5-2.el9.x86_64.rpm 16 MB/s | 6.1 MB 00:00 2026-03-31T19:01:02.459 INFO:teuthology.orchestra.run.vm05.stdout:(87/150): python3-pyasn1-0.4.8-7.el9.noarch.rpm 752 kB/s | 157 kB 00:00 2026-03-31T19:01:02.475 INFO:teuthology.orchestra.run.vm05.stdout:(88/150): python3-pyasn1-modules-0.4.8-7.el9.no 1.9 MB/s | 277 kB 00:00 2026-03-31T19:01:02.520 INFO:teuthology.orchestra.run.vm05.stdout:(89/150): python3-requests-oauthlib-1.3.0-12.el 872 kB/s | 54 kB 00:00 2026-03-31T19:01:02.733 INFO:teuthology.orchestra.run.vm05.stdout:(90/150): python3-toml-0.10.2-6.el9.noarch.rpm 195 kB/s | 42 kB 00:00 2026-03-31T19:01:02.768 INFO:teuthology.orchestra.run.vm05.stdout:(91/150): qatlib-25.08.0-2.el9.x86_64.rpm 6.7 MB/s | 240 kB 00:00 2026-03-31T19:01:02.880 INFO:teuthology.orchestra.run.vm05.stdout:(92/150): qatlib-service-25.08.0-2.el9.x86_64.r 334 kB/s | 37 kB 00:00 2026-03-31T19:01:02.912 INFO:teuthology.orchestra.run.vm05.stdout:(93/150): 
qatzip-libs-1.3.1-1.el9.x86_64.rpm 2.0 MB/s | 66 kB 00:00 2026-03-31T19:01:02.946 INFO:teuthology.orchestra.run.vm05.stdout:(94/150): socat-1.7.4.1-8.el9.x86_64.rpm 8.6 MB/s | 303 kB 00:00 2026-03-31T19:01:03.007 INFO:teuthology.orchestra.run.vm05.stdout:(95/150): xmlsec1-1.2.29-13.el9.x86_64.rpm 3.1 MB/s | 189 kB 00:00 2026-03-31T19:01:03.060 INFO:teuthology.orchestra.run.vm05.stdout:(96/150): xmlsec1-openssl-1.2.29-13.el9.x86_64. 1.7 MB/s | 90 kB 00:00 2026-03-31T19:01:03.107 INFO:teuthology.orchestra.run.vm05.stdout:(97/150): xmlstarlet-1.6.1-20.el9.x86_64.rpm 1.3 MB/s | 64 kB 00:00 2026-03-31T19:01:03.288 INFO:teuthology.orchestra.run.vm05.stdout:(98/150): lua-devel-5.4.4-4.el9.x86_64.rpm 123 kB/s | 22 kB 00:00 2026-03-31T19:01:03.640 INFO:teuthology.orchestra.run.vm05.stdout:(99/150): python3-scipy-1.9.3-2.el9.x86_64.rpm 17 MB/s | 19 MB 00:01 2026-03-31T19:01:03.653 INFO:teuthology.orchestra.run.vm05.stdout:(100/150): abseil-cpp-20211102.0-4.el9.x86_64.r 42 MB/s | 551 kB 00:00 2026-03-31T19:01:03.660 INFO:teuthology.orchestra.run.vm05.stdout:(101/150): gperftools-libs-2.9.1-3.el9.x86_64.r 46 MB/s | 308 kB 00:00 2026-03-31T19:01:03.663 INFO:teuthology.orchestra.run.vm05.stdout:(102/150): grpc-data-1.46.7-10.el9.noarch.rpm 6.2 MB/s | 19 kB 00:00 2026-03-31T19:01:03.719 INFO:teuthology.orchestra.run.vm05.stdout:(103/150): libarrow-9.0.0-15.el9.x86_64.rpm 80 MB/s | 4.4 MB 00:00 2026-03-31T19:01:03.722 INFO:teuthology.orchestra.run.vm05.stdout:(104/150): libarrow-doc-9.0.0-15.el9.noarch.rpm 7.8 MB/s | 25 kB 00:00 2026-03-31T19:01:03.725 INFO:teuthology.orchestra.run.vm05.stdout:(105/150): liboath-2.6.12-1.el9.x86_64.rpm 14 MB/s | 49 kB 00:00 2026-03-31T19:01:03.729 INFO:teuthology.orchestra.run.vm05.stdout:(106/150): libunwind-1.6.2-1.el9.x86_64.rpm 17 MB/s | 67 kB 00:00 2026-03-31T19:01:03.734 INFO:teuthology.orchestra.run.vm05.stdout:(107/150): luarocks-3.9.2-5.el9.noarch.rpm 33 MB/s | 151 kB 00:00 2026-03-31T19:01:03.746 
INFO:teuthology.orchestra.run.vm05.stdout:(108/150): parquet-libs-9.0.0-15.el9.x86_64.rpm 68 MB/s | 838 kB 00:00 2026-03-31T19:01:03.755 INFO:teuthology.orchestra.run.vm05.stdout:(109/150): python3-asyncssh-2.13.2-5.el9.noarch 62 MB/s | 548 kB 00:00 2026-03-31T19:01:03.938 INFO:teuthology.orchestra.run.vm05.stdout:(110/150): ceph-test-20.2.0-721.g5bb32787.el9.x 7.2 MB/s | 84 MB 00:11 2026-03-31T19:01:03.940 INFO:teuthology.orchestra.run.vm05.stdout:(111/150): python3-autocommand-2.2.2-8.el9.noar 160 kB/s | 29 kB 00:00 2026-03-31T19:01:03.942 INFO:teuthology.orchestra.run.vm05.stdout:(112/150): protobuf-compiler-3.14.0-17.el9.x86_ 1.3 MB/s | 862 kB 00:00 2026-03-31T19:01:03.943 INFO:teuthology.orchestra.run.vm05.stdout:(113/150): python3-bcrypt-3.2.2-1.el9.x86_64.rp 13 MB/s | 43 kB 00:00 2026-03-31T19:01:03.945 INFO:teuthology.orchestra.run.vm05.stdout:(114/150): python3-backports-tarfile-1.2.0-1.el 9.2 MB/s | 60 kB 00:00 2026-03-31T19:01:03.946 INFO:teuthology.orchestra.run.vm05.stdout:(115/150): python3-certifi-2023.05.07-4.el9.noa 6.1 MB/s | 14 kB 00:00 2026-03-31T19:01:03.947 INFO:teuthology.orchestra.run.vm05.stdout:(116/150): python3-cachetools-4.2.4-1.el9.noarc 7.3 MB/s | 32 kB 00:00 2026-03-31T19:01:03.952 INFO:teuthology.orchestra.run.vm05.stdout:(117/150): python3-cherrypy-18.10.0-5.el9.noarc 47 MB/s | 290 kB 00:00 2026-03-31T19:01:03.954 INFO:teuthology.orchestra.run.vm05.stdout:(118/150): python3-cheroot-10.0.1-5.el9.noarch. 19 MB/s | 173 kB 00:00 2026-03-31T19:01:03.962 INFO:teuthology.orchestra.run.vm05.stdout:(119/150): python3-grpcio-tools-1.46.7-10.el9.x 17 MB/s | 144 kB 00:00 2026-03-31T19:01:03.970 INFO:teuthology.orchestra.run.vm05.stdout:(120/150): python3-influxdb-5.3.1-1.el9.noarch. 
20 MB/s | 139 kB 00:00 2026-03-31T19:01:03.975 INFO:teuthology.orchestra.run.vm05.stdout:(121/150): python3-isodate-0.6.1-3.el9.noarch.r 10 MB/s | 56 kB 00:00 2026-03-31T19:01:03.978 INFO:teuthology.orchestra.run.vm05.stdout:(122/150): python3-google-auth-2.45.0-1.el9.noa 8.2 MB/s | 254 kB 00:00 2026-03-31T19:01:03.982 INFO:teuthology.orchestra.run.vm05.stdout:(123/150): python3-grpcio-1.46.7-10.el9.x86_64. 69 MB/s | 2.0 MB 00:00 2026-03-31T19:01:03.982 INFO:teuthology.orchestra.run.vm05.stdout:(124/150): python3-jaraco-8.2.1-3.el9.noarch.rp 1.5 MB/s | 11 kB 00:00 2026-03-31T19:01:03.983 INFO:teuthology.orchestra.run.vm05.stdout:(125/150): python3-jaraco-classes-3.2.1-5.el9.n 3.3 MB/s | 18 kB 00:00 2026-03-31T19:01:03.986 INFO:teuthology.orchestra.run.vm05.stdout:(126/150): python3-jaraco-collections-3.0.0-8.e 7.5 MB/s | 23 kB 00:00 2026-03-31T19:01:03.986 INFO:teuthology.orchestra.run.vm05.stdout:(127/150): python3-jaraco-context-6.0.1-3.el9.n 5.3 MB/s | 20 kB 00:00 2026-03-31T19:01:03.987 INFO:teuthology.orchestra.run.vm05.stdout:(128/150): python3-jaraco-functools-3.5.0-2.el9 5.4 MB/s | 19 kB 00:00 2026-03-31T19:01:03.988 INFO:teuthology.orchestra.run.vm05.stdout:(129/150): python3-jaraco-text-4.0.0-2.el9.noar 10 MB/s | 26 kB 00:00 2026-03-31T19:01:03.995 INFO:teuthology.orchestra.run.vm05.stdout:(130/150): python3-msgpack-1.0.3-2.el9.x86_64.r 14 MB/s | 86 kB 00:00 2026-03-31T19:01:04.001 INFO:teuthology.orchestra.run.vm05.stdout:(131/150): python3-natsort-7.1.1-5.el9.noarch.r 8.9 MB/s | 58 kB 00:00 2026-03-31T19:01:04.003 INFO:teuthology.orchestra.run.vm05.stdout:(132/150): python3-more-itertools-8.12.0-2.el9. 
4.9 MB/s | 79 kB 00:00 2026-03-31T19:01:04.007 INFO:teuthology.orchestra.run.vm05.stdout:(133/150): python3-kubernetes-26.1.0-3.el9.noar 51 MB/s | 1.0 MB 00:00 2026-03-31T19:01:04.007 INFO:teuthology.orchestra.run.vm05.stdout:(134/150): python3-portend-3.1.0-2.el9.noarch.r 2.9 MB/s | 16 kB 00:00 2026-03-31T19:01:04.009 INFO:teuthology.orchestra.run.vm05.stdout:(135/150): python3-repoze-lru-0.7-16.el9.noarch 13 MB/s | 31 kB 00:00 2026-03-31T19:01:04.010 INFO:teuthology.orchestra.run.vm05.stdout:(136/150): python3-pyOpenSSL-21.0.0-1.el9.noarc 12 MB/s | 90 kB 00:00 2026-03-31T19:01:04.012 INFO:teuthology.orchestra.run.vm05.stdout:(137/150): python3-routes-2.5.1-5.el9.noarch.rp 39 MB/s | 188 kB 00:00 2026-03-31T19:01:04.014 INFO:teuthology.orchestra.run.vm05.stdout:(138/150): python3-rsa-4.9-2.el9.noarch.rpm 14 MB/s | 59 kB 00:00 2026-03-31T19:01:04.016 INFO:teuthology.orchestra.run.vm05.stdout:(139/150): python3-tempora-5.0.0-2.el9.noarch.r 11 MB/s | 36 kB 00:00 2026-03-31T19:01:04.017 INFO:teuthology.orchestra.run.vm05.stdout:(140/150): python3-typing-extensions-4.15.0-1.e 23 MB/s | 86 kB 00:00 2026-03-31T19:01:04.019 INFO:teuthology.orchestra.run.vm05.stdout:(141/150): python3-saml-1.16.0-1.el9.noarch.rpm 15 MB/s | 125 kB 00:00 2026-03-31T19:01:04.020 INFO:teuthology.orchestra.run.vm05.stdout:(142/150): python3-websocket-client-1.2.3-2.el9 21 MB/s | 90 kB 00:00 2026-03-31T19:01:04.021 INFO:teuthology.orchestra.run.vm05.stdout:(143/150): python3-xmlsec-1.3.13-1.el9.x86_64.r 13 MB/s | 48 kB 00:00 2026-03-31T19:01:04.022 INFO:teuthology.orchestra.run.vm05.stdout:(144/150): python3-xmltodict-0.12.0-15.el9.noar 7.7 MB/s | 22 kB 00:00 2026-03-31T19:01:04.022 INFO:teuthology.orchestra.run.vm05.stdout:(145/150): python3-zc-lockfile-2.0-10.el9.noarc 9.6 MB/s | 20 kB 00:00 2026-03-31T19:01:04.026 INFO:teuthology.orchestra.run.vm05.stdout:(146/150): re2-20211101-20.el9.x86_64.rpm 36 MB/s | 191 kB 00:00 2026-03-31T19:01:04.042 
INFO:teuthology.orchestra.run.vm05.stdout:(147/150): s3cmd-2.4.0-1.el9.noarch.rpm 10 MB/s | 206 kB 00:00 2026-03-31T19:01:04.045 INFO:teuthology.orchestra.run.vm05.stdout:(148/150): thrift-0.15.0-4.el9.x86_64.rpm 70 MB/s | 1.6 MB 00:00 2026-03-31T19:01:05.097 INFO:teuthology.orchestra.run.vm05.stdout:(149/150): librbd1-20.2.0-721.g5bb32787.el9.x86 2.7 MB/s | 2.8 MB 00:01 2026-03-31T19:01:05.207 INFO:teuthology.orchestra.run.vm05.stdout:(150/150): librados2-20.2.0-721.g5bb32787.el9.x 3.0 MB/s | 3.5 MB 00:01 2026-03-31T19:01:05.209 INFO:teuthology.orchestra.run.vm05.stdout:-------------------------------------------------------------------------------- 2026-03-31T19:01:05.209 INFO:teuthology.orchestra.run.vm05.stdout:Total 15 MB/s | 270 MB 00:17 2026-03-31T19:01:05.833 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-31T19:01:05.894 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-31T19:01:05.894 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-31T19:01:06.895 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 
2026-03-31T19:01:06.895 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-31T19:01:07.972 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-31T19:01:07.980 INFO:teuthology.orchestra.run.vm05.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/152 2026-03-31T19:01:07.983 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/152 2026-03-31T19:01:07.994 INFO:teuthology.orchestra.run.vm05.stdout: Installing : liboath-2.6.12-1.el9.x86_64 3/152 2026-03-31T19:01:08.166 INFO:teuthology.orchestra.run.vm05.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 4/152 2026-03-31T19:01:08.167 INFO:teuthology.orchestra.run.vm05.stdout: Upgrading : librados2-2:20.2.0-721.g5bb32787.el9.x86_64 5/152 2026-03-31T19:01:08.224 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados2-2:20.2.0-721.g5bb32787.el9.x86_64 5/152 2026-03-31T19:01:08.226 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephfs2-2:20.2.0-721.g5bb32787.el9.x86_64 6/152 2026-03-31T19:01:08.242 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephfs2-2:20.2.0-721.g5bb32787.el9.x86_64 6/152 2026-03-31T19:01:08.245 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephfs-daemon-2:20.2.0-721.g5bb32787.el9.x86_ 7/152 2026-03-31T19:01:08.246 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephfs-proxy2-2:20.2.0-721.g5bb32787.el9.x86_ 8/152 2026-03-31T19:01:08.275 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephfs-proxy2-2:20.2.0-721.g5bb32787.el9.x86_ 8/152 2026-03-31T19:01:08.282 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rados-2:20.2.0-721.g5bb32787.el9.x86_64 9/152 2026-03-31T19:01:08.293 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 10/152 2026-03-31T19:01:08.296 INFO:teuthology.orchestra.run.vm05.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 11/152 2026-03-31T19:01:08.299 
INFO:teuthology.orchestra.run.vm05.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 12/152 2026-03-31T19:01:08.302 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 13/152 2026-03-31T19:01:08.308 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 14/152 2026-03-31T19:01:08.454 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libnbd-1.20.3-4.el9.x86_64 15/152 2026-03-31T19:01:08.455 INFO:teuthology.orchestra.run.vm05.stdout: Upgrading : librbd1-2:20.2.0-721.g5bb32787.el9.x86_64 16/152 2026-03-31T19:01:08.506 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librbd1-2:20.2.0-721.g5bb32787.el9.x86_64 16/152 2026-03-31T19:01:08.514 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-lxml-4.6.5-3.el9.x86_64 17/152 2026-03-31T19:01:08.524 INFO:teuthology.orchestra.run.vm05.stdout: Installing : xmlsec1-1.2.29-13.el9.x86_64 18/152 2026-03-31T19:01:08.525 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephsqlite-2:20.2.0-721.g5bb32787.el9.x86_64 19/152 2026-03-31T19:01:08.550 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephsqlite-2:20.2.0-721.g5bb32787.el9.x86_64 19/152 2026-03-31T19:01:08.551 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libradosstriper1-2:20.2.0-721.g5bb32787.el9.x86_ 20/152 2026-03-31T19:01:08.567 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libradosstriper1-2:20.2.0-721.g5bb32787.el9.x86_ 20/152 2026-03-31T19:01:08.604 INFO:teuthology.orchestra.run.vm05.stdout: Installing : re2-1:20211101-20.el9.x86_64 21/152 2026-03-31T19:01:08.630 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 22/152 2026-03-31T19:01:08.641 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 23/152 2026-03-31T19:01:08.647 INFO:teuthology.orchestra.run.vm05.stdout: Installing : protobuf-3.14.0-17.el9.x86_64 24/152 2026-03-31T19:01:08.651 
INFO:teuthology.orchestra.run.vm05.stdout: Installing : lua-5.4.4-4.el9.x86_64 25/152 2026-03-31T19:01:08.657 INFO:teuthology.orchestra.run.vm05.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 26/152 2026-03-31T19:01:08.685 INFO:teuthology.orchestra.run.vm05.stdout: Installing : unzip-6.0-59.el9.x86_64 27/152 2026-03-31T19:01:08.702 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 28/152 2026-03-31T19:01:08.705 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 29/152 2026-03-31T19:01:08.713 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 30/152 2026-03-31T19:01:08.716 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 31/152 2026-03-31T19:01:08.754 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 32/152 2026-03-31T19:01:08.761 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-ceph-common-2:20.2.0-721.g5bb32787.el9.x 33/152 2026-03-31T19:01:08.771 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-ceph-argparse-2:20.2.0-721.g5bb32787.el9 34/152 2026-03-31T19:01:08.785 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cephfs-2:20.2.0-721.g5bb32787.el9.x86_64 35/152 2026-03-31T19:01:08.799 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 36/152 2026-03-31T19:01:08.826 INFO:teuthology.orchestra.run.vm05.stdout: Installing : zip-3.0-35.el9.x86_64 37/152 2026-03-31T19:01:08.832 INFO:teuthology.orchestra.run.vm05.stdout: Installing : luarocks-3.9.2-5.el9.noarch 38/152 2026-03-31T19:01:08.840 INFO:teuthology.orchestra.run.vm05.stdout: Installing : lua-devel-5.4.4-4.el9.x86_64 39/152 2026-03-31T19:01:08.897 INFO:teuthology.orchestra.run.vm05.stdout: Installing : protobuf-compiler-3.14.0-17.el9.x86_64 40/152 2026-03-31T19:01:08.913 INFO:teuthology.orchestra.run.vm05.stdout: 
Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 41/152 2026-03-31T19:01:08.916 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rsa-4.9-2.el9.noarch 42/152 2026-03-31T19:01:08.923 INFO:teuthology.orchestra.run.vm05.stdout: Installing : xmlsec1-openssl-1.2.29-13.el9.x86_64 43/152 2026-03-31T19:01:08.940 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-xmlsec-1.3.13-1.el9.x86_64 44/152 2026-03-31T19:01:08.946 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rbd-2:20.2.0-721.g5bb32787.el9.x86_64 45/152 2026-03-31T19:01:08.951 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 46/152 2026-03-31T19:01:08.960 INFO:teuthology.orchestra.run.vm05.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 47/152 2026-03-31T19:01:08.965 INFO:teuthology.orchestra.run.vm05.stdout: Installing : librados-devel-2:20.2.0-721.g5bb32787.el9.x86_64 48/152 2026-03-31T19:01:08.969 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 49/152 2026-03-31T19:01:08.986 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 50/152 2026-03-31T19:01:08.992 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 51/152 2026-03-31T19:01:08.998 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 52/152 2026-03-31T19:01:09.011 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 53/152 2026-03-31T19:01:09.022 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 54/152 2026-03-31T19:01:09.030 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 55/152 2026-03-31T19:01:09.052 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-msgpack-1.0.3-2.el9.x86_64 56/152 2026-03-31T19:01:09.066 
INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-influxdb-5.3.1-1.el9.noarch 57/152 2026-03-31T19:01:09.085 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-isodate-0.6.1-3.el9.noarch 58/152 2026-03-31T19:01:09.091 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-saml-1.16.0-1.el9.noarch 59/152 2026-03-31T19:01:09.100 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 60/152 2026-03-31T19:01:09.146 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 61/152 2026-03-31T19:01:09.518 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 62/152 2026-03-31T19:01:09.533 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 63/152 2026-03-31T19:01:09.538 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 64/152 2026-03-31T19:01:09.545 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 65/152 2026-03-31T19:01:09.550 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 66/152 2026-03-31T19:01:09.558 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 67/152 2026-03-31T19:01:09.561 INFO:teuthology.orchestra.run.vm05.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 68/152 2026-03-31T19:01:09.563 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 69/152 2026-03-31T19:01:09.593 INFO:teuthology.orchestra.run.vm05.stdout: Installing : grpc-data-1.46.7-10.el9.noarch 70/152 2026-03-31T19:01:09.644 INFO:teuthology.orchestra.run.vm05.stdout: Installing : abseil-cpp-20211102.0-4.el9.x86_64 71/152 2026-03-31T19:01:09.657 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-grpcio-1.46.7-10.el9.x86_64 72/152 2026-03-31T19:01:09.664 
INFO:teuthology.orchestra.run.vm05.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 73/152 2026-03-31T19:01:09.669 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 74/152 2026-03-31T19:01:09.676 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 75/152 2026-03-31T19:01:09.681 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 76/152 2026-03-31T19:01:09.751 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 77/152 2026-03-31T19:01:09.773 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 78/152 2026-03-31T19:01:09.805 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 79/152 2026-03-31T19:01:09.817 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-protobuf-3.14.0-17.el9.noarch 80/152 2026-03-31T19:01:09.825 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-grpcio-tools-1.46.7-10.el9.x86_64 81/152 2026-03-31T19:01:09.833 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 82/152 2026-03-31T19:01:09.874 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 83/152 2026-03-31T19:01:10.135 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 84/152 2026-03-31T19:01:10.164 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 85/152 2026-03-31T19:01:10.169 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 86/152 2026-03-31T19:01:10.173 INFO:teuthology.orchestra.run.vm05.stdout: Installing : perl-Benchmark-1.23-483.el9.noarch 87/152 2026-03-31T19:01:10.233 INFO:teuthology.orchestra.run.vm05.stdout: Installing : openblas-0.3.29-1.el9.x86_64 88/152 2026-03-31T19:01:10.236 
INFO:teuthology.orchestra.run.vm05.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 89/152 2026-03-31T19:01:10.259 INFO:teuthology.orchestra.run.vm05.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 90/152 2026-03-31T19:01:10.629 INFO:teuthology.orchestra.run.vm05.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 91/152 2026-03-31T19:01:10.716 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 92/152 2026-03-31T19:01:11.485 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 93/152 2026-03-31T19:01:11.508 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 94/152 2026-03-31T19:01:11.515 INFO:teuthology.orchestra.run.vm05.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 95/152 2026-03-31T19:01:11.830 INFO:teuthology.orchestra.run.vm05.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 96/152 2026-03-31T19:01:11.833 INFO:teuthology.orchestra.run.vm05.stdout: Installing : librgw2-2:20.2.0-721.g5bb32787.el9.x86_64 97/152 2026-03-31T19:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librgw2-2:20.2.0-721.g5bb32787.el9.x86_64 97/152 2026-03-31T19:01:11.860 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rgw-2:20.2.0-721.g5bb32787.el9.x86_64 98/152 2026-03-31T19:01:13.227 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-common-2:20.2.0-721.g5bb32787.el9.x86_64 99/152 2026-03-31T19:01:13.234 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-common-2:20.2.0-721.g5bb32787.el9.x86_64 99/152 2026-03-31T19:01:13.269 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-common-2:20.2.0-721.g5bb32787.el9.x86_64 99/152 2026-03-31T19:01:13.301 INFO:teuthology.orchestra.run.vm05.stdout: Installing : smartmontools-1:7.2-10.el9.x86_64 100/152 2026-03-31T19:01:13.312 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: 
smartmontools-1:7.2-10.el9.x86_64 100/152 2026-03-31T19:01:13.312 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/smartd.service → /usr/lib/systemd/system/smartd.service. 2026-03-31T19:01:13.312 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:13.332 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pyparsing-2.4.7-9.el9.noarch 101/152 2026-03-31T19:01:13.342 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-packaging-20.9-5.el9.noarch 102/152 2026-03-31T19:01:13.360 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-ply-3.11-14.el9.noarch 103/152 2026-03-31T19:01:13.385 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 104/152 2026-03-31T19:01:13.507 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 105/152 2026-03-31T19:01:13.523 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 106/152 2026-03-31T19:01:13.554 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 107/152 2026-03-31T19:01:13.672 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cheroot-10.0.1-5.el9.noarch 108/152 2026-03-31T19:01:13.745 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cherrypy-18.10.0-5.el9.noarch 109/152 2026-03-31T19:01:13.757 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 110/152 2026-03-31T19:01:13.764 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 111/152 2026-03-31T19:01:13.788 INFO:teuthology.orchestra.run.vm05.stdout: Installing : pciutils-3.7.0-7.el9.x86_64 112/152 2026-03-31T19:01:13.793 INFO:teuthology.orchestra.run.vm05.stdout: Installing : qatlib-25.08.0-2.el9.x86_64 113/152 2026-03-31T19:01:13.795 INFO:teuthology.orchestra.run.vm05.stdout: Installing : 
qatlib-service-25.08.0-2.el9.x86_64 114/152 2026-03-31T19:01:13.815 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 114/152 2026-03-31T19:01:13.974 INFO:teuthology.orchestra.run.vm05.stdout: Installing : qatzip-libs-1.3.1-1.el9.x86_64 115/152 2026-03-31T19:01:13.980 INFO:teuthology.orchestra.run.vm05.stdout: Installing : nvme-cli-2.16-1.el9.x86_64 116/152 2026-03-31T19:01:14.291 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: nvme-cli-2.16-1.el9.x86_64 116/152 2026-03-31T19:01:14.291 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmefc-boot-connections.service → /usr/lib/systemd/system/nvmefc-boot-connections.service. 2026-03-31T19:01:14.291 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:14.764 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-base-2:20.2.0-721.g5bb32787.el9.x86_64 117/152 2026-03-31T19:01:14.862 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-base-2:20.2.0-721.g5bb32787.el9.x86_64 117/152 2026-03-31T19:01:14.862 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 2026-03-31T19:01:14.862 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 
2026-03-31T19:01:14.862 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:14.918 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-selinux-2:20.2.0-721.g5bb32787.el9.x86_64 118/152 2026-03-31T19:01:21.091 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-selinux-2:20.2.0-721.g5bb32787.el9.x86_64 118/152 2026-03-31T19:01:21.091 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /sys 2026-03-31T19:01:21.091 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /proc 2026-03-31T19:01:21.091 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /mnt 2026-03-31T19:01:21.091 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /var/tmp 2026-03-31T19:01:21.091 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /home 2026-03-31T19:01:21.091 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /root 2026-03-31T19:01:21.091 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /tmp 2026-03-31T19:01:21.091 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:21.206 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mds-2:20.2.0-721.g5bb32787.el9.x86_64 119/152 2026-03-31T19:01:21.228 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:20.2.0-721.g5bb32787.el9.x86_64 119/152 2026-03-31T19:01:21.229 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:01:21.229 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-31T19:01:21.229 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-31T19:01:21.229 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 
2026-03-31T19:01:21.229 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:21.466 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mon-2:20.2.0-721.g5bb32787.el9.x86_64 120/152 2026-03-31T19:01:21.486 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:20.2.0-721.g5bb32787.el9.x86_64 120/152 2026-03-31T19:01:21.487 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:01:21.487 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-31T19:01:21.487 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-31T19:01:21.487 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-31T19:01:21.487 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:21.495 INFO:teuthology.orchestra.run.vm05.stdout: Installing : mailcap-2.1.49-5.el9.noarch 121/152 2026-03-31T19:01:21.498 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 122/152 2026-03-31T19:01:21.517 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 123/152 2026-03-31T19:01:21.517 INFO:teuthology.orchestra.run.vm05.stdout:Creating group 'qat' with GID 994. 2026-03-31T19:01:21.517 INFO:teuthology.orchestra.run.vm05.stdout:Creating group 'libstoragemgmt' with GID 993. 2026-03-31T19:01:21.517 INFO:teuthology.orchestra.run.vm05.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 993 and GID 993. 
2026-03-31T19:01:21.517 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:21.526 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 123/152 2026-03-31T19:01:21.554 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 123/152 2026-03-31T19:01:21.554 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-31T19:01:21.554 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:21.573 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 124/152 2026-03-31T19:01:21.599 INFO:teuthology.orchestra.run.vm05.stdout: Installing : fuse-2.9.9-17.el9.x86_64 125/152 2026-03-31T19:01:21.669 INFO:teuthology.orchestra.run.vm05.stdout: Installing : cryptsetup-2.8.1-3.el9.x86_64 126/152 2026-03-31T19:01:21.673 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-volume-2:20.2.0-721.g5bb32787.el9.noarch 127/152 2026-03-31T19:01:21.688 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-volume-2:20.2.0-721.g5bb32787.el9.noarch 127/152 2026-03-31T19:01:21.688 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:01:21.688 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-volume@*.service" escaped as "ceph-volume@\x2a.service". 2026-03-31T19:01:21.688 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:22.447 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-osd-2:20.2.0-721.g5bb32787.el9.x86_64 128/152 2026-03-31T19:01:22.469 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:20.2.0-721.g5bb32787.el9.x86_64 128/152 2026-03-31T19:01:22.469 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-31T19:01:22.469 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-31T19:01:22.469 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-31T19:01:22.469 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-31T19:01:22.469 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:22.535 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: cephadm-2:20.2.0-721.g5bb32787.el9.noarch 129/152 2026-03-31T19:01:22.538 INFO:teuthology.orchestra.run.vm05.stdout: Installing : cephadm-2:20.2.0-721.g5bb32787.el9.noarch 129/152 2026-03-31T19:01:22.545 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-prometheus-alerts-2:20.2.0-721.g5bb32787.el 130/152 2026-03-31T19:01:22.571 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-grafana-dashboards-2:20.2.0-721.g5bb32787.e 131/152 2026-03-31T19:01:22.574 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-cephadm-2:20.2.0-721.g5bb32787.el9.noar 132/152 2026-03-31T19:01:23.809 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-cephadm-2:20.2.0-721.g5bb32787.el9.noar 132/152 2026-03-31T19:01:23.819 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-dashboard-2:20.2.0-721.g5bb32787.el9.no 133/152 2026-03-31T19:01:24.388 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-dashboard-2:20.2.0-721.g5bb32787.el9.no 133/152 2026-03-31T19:01:24.394 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-diskprediction-local-2:20.2.0-721.g5bb3 134/152 2026-03-31T19:01:24.406 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:20.2.0-721.g5bb3 134/152 2026-03-31T19:01:24.408 
INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-k8sevents-2:20.2.0-721.g5bb32787.el9.no 135/152 2026-03-31T19:01:24.471 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-k8sevents-2:20.2.0-721.g5bb32787.el9.no 135/152 2026-03-31T19:01:24.523 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-modules-core-2:20.2.0-721.g5bb32787.el9 136/152 2026-03-31T19:01:24.526 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-2:20.2.0-721.g5bb32787.el9.x86_64 137/152 2026-03-31T19:01:24.547 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:20.2.0-721.g5bb32787.el9.x86_64 137/152 2026-03-31T19:01:24.547 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:01:24.547 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-31T19:01:24.547 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-31T19:01:24.547 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 
2026-03-31T19:01:24.547 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:24.561 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-rook-2:20.2.0-721.g5bb32787.el9.noarch 138/152 2026-03-31T19:01:24.573 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-rook-2:20.2.0-721.g5bb32787.el9.noarch 138/152 2026-03-31T19:01:24.620 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-2:20.2.0-721.g5bb32787.el9.x86_64 139/152 2026-03-31T19:01:25.885 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-fuse-2:20.2.0-721.g5bb32787.el9.x86_64 140/152 2026-03-31T19:01:25.890 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-radosgw-2:20.2.0-721.g5bb32787.el9.x86_64 141/152 2026-03-31T19:01:25.912 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:20.2.0-721.g5bb32787.el9.x86_64 141/152 2026-03-31T19:01:25.912 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:01:25.912 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-31T19:01:25.912 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-31T19:01:25.912 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 
2026-03-31T19:01:25.912 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:25.924 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-immutable-object-cache-2:20.2.0-721.g5bb327 142/152 2026-03-31T19:01:25.944 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:20.2.0-721.g5bb327 142/152 2026-03-31T19:01:25.944 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:01:25.944 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-31T19:01:25.944 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:26.094 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-mirror-2:20.2.0-721.g5bb32787.el9.x86_64 143/152 2026-03-31T19:01:26.115 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:20.2.0-721.g5bb32787.el9.x86_64 143/152 2026-03-31T19:01:26.115 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:01:26.115 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-31T19:01:26.115 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-31T19:01:26.115 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 
2026-03-31T19:01:26.115 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:30.491 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-test-2:20.2.0-721.g5bb32787.el9.x86_64 144/152 2026-03-31T19:01:30.497 INFO:teuthology.orchestra.run.vm05.stdout: Installing : perl-Test-Harness-1:3.42-461.el9.noarch 145/152 2026-03-31T19:01:30.505 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephfs-devel-2:20.2.0-721.g5bb32787.el9.x86_6 146/152 2026-03-31T19:01:30.517 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-fuse-2:20.2.0-721.g5bb32787.el9.x86_64 147/152 2026-03-31T19:01:30.537 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-nbd-2:20.2.0-721.g5bb32787.el9.x86_64 148/152 2026-03-31T19:01:30.544 INFO:teuthology.orchestra.run.vm05.stdout: Installing : s3cmd-2.4.0-1.el9.noarch 149/152 2026-03-31T19:01:30.547 INFO:teuthology.orchestra.run.vm05.stdout: Installing : bzip2-1.0.8-11.el9.x86_64 150/152 2026-03-31T19:01:30.547 INFO:teuthology.orchestra.run.vm05.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 151/152 2026-03-31T19:01:30.562 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 151/152 2026-03-31T19:01:30.562 INFO:teuthology.orchestra.run.vm05.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 152/152 2026-03-31T19:01:31.865 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 152/152 2026-03-31T19:01:31.865 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-2:20.2.0-721.g5bb32787.el9.x86_64 1/152 2026-03-31T19:01:31.865 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-base-2:20.2.0-721.g5bb32787.el9.x86_64 2/152 2026-03-31T19:01:31.865 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-common-2:20.2.0-721.g5bb32787.el9.x86_64 3/152 2026-03-31T19:01:31.865 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-fuse-2:20.2.0-721.g5bb32787.el9.x86_64 4/152 2026-03-31T19:01:31.865 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-immutable-object-cache-2:20.2.0-721.g5bb327 5/152 2026-03-31T19:01:31.865 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mds-2:20.2.0-721.g5bb32787.el9.x86_64 6/152 2026-03-31T19:01:31.865 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-2:20.2.0-721.g5bb32787.el9.x86_64 7/152 2026-03-31T19:01:31.865 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mon-2:20.2.0-721.g5bb32787.el9.x86_64 8/152 2026-03-31T19:01:31.865 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-osd-2:20.2.0-721.g5bb32787.el9.x86_64 9/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-radosgw-2:20.2.0-721.g5bb32787.el9.x86_64 10/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-selinux-2:20.2.0-721.g5bb32787.el9.x86_64 11/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-test-2:20.2.0-721.g5bb32787.el9.x86_64 12/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-daemon-2:20.2.0-721.g5bb32787.el9.x86_ 13/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-devel-2:20.2.0-721.g5bb32787.el9.x86_6 14/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-proxy2-2:20.2.0-721.g5bb32787.el9.x86_ 15/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs2-2:20.2.0-721.g5bb32787.el9.x86_64 16/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephsqlite-2:20.2.0-721.g5bb32787.el9.x86_64 17/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados-devel-2:20.2.0-721.g5bb32787.el9.x86_64 18/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libradosstriper1-2:20.2.0-721.g5bb32787.el9.x86_ 19/152 2026-03-31T19:01:31.866 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librgw2-2:20.2.0-721.g5bb32787.el9.x86_64 20/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-argparse-2:20.2.0-721.g5bb32787.el9 21/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-common-2:20.2.0-721.g5bb32787.el9.x 22/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cephfs-2:20.2.0-721.g5bb32787.el9.x86_64 23/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rados-2:20.2.0-721.g5bb32787.el9.x86_64 24/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rbd-2:20.2.0-721.g5bb32787.el9.x86_64 25/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rgw-2:20.2.0-721.g5bb32787.el9.x86_64 26/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-fuse-2:20.2.0-721.g5bb32787.el9.x86_64 27/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-mirror-2:20.2.0-721.g5bb32787.el9.x86_64 28/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-nbd-2:20.2.0-721.g5bb32787.el9.x86_64 29/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-grafana-dashboards-2:20.2.0-721.g5bb32787.e 30/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-cephadm-2:20.2.0-721.g5bb32787.el9.noar 31/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-dashboard-2:20.2.0-721.g5bb32787.el9.no 32/152 2026-03-31T19:01:31.866 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-diskprediction-local-2:20.2.0-721.g5bb3 33/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-k8sevents-2:20.2.0-721.g5bb32787.el9.no 34/152 2026-03-31T19:01:31.868 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-modules-core-2:20.2.0-721.g5bb32787.el9 35/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-rook-2:20.2.0-721.g5bb32787.el9.noarch 36/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-prometheus-alerts-2:20.2.0-721.g5bb32787.el 37/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-volume-2:20.2.0-721.g5bb32787.el9.noarch 38/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : cephadm-2:20.2.0-721.g5bb32787.el9.noarch 39/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : bzip2-1.0.8-11.el9.x86_64 40/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : cryptsetup-2.8.1-3.el9.x86_64 41/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : fuse-2.9.9-17.el9.x86_64 42/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 43/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 44/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 45/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 46/152 2026-03-31T19:01:31.868 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 47/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : nvme-cli-2.16-1.el9.x86_64 48/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : pciutils-3.7.0-7.el9.x86_64 49/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 50/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
python3-cryptography-36.0.1-5.el9.x86_64 51/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ply-3.11-14.el9.noarch 52/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 53/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyparsing-2.4.7-9.el9.noarch 54/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 55/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 56/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : smartmontools-1:7.2-10.el9.x86_64 57/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : unzip-6.0-59.el9.x86_64 58/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : zip-3.0-35.el9.x86_64 59/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 60/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 61/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 62/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 63/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libnbd-1.20.3-4.el9.x86_64 64/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 65/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 66/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 67/152 2026-03-31T19:01:31.869 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 68/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 69/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 70/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lua-5.4.4-4.el9.x86_64 71/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 72/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 73/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : perl-Benchmark-1.23-483.el9.noarch 74/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : perl-Test-Harness-1:3.42-461.el9.noarch 75/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : protobuf-3.14.0-17.el9.x86_64 76/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 77/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 78/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 79/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 80/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 81/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-lxml-4.6.5-3.el9.x86_64 82/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 83/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
python3-numpy-1:1.23.5-2.el9.x86_64 84/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 85/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-packaging-20.9-5.el9.noarch 86/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-protobuf-3.14.0-17.el9.noarch 87/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 88/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 89/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 90/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 91/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 92/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : qatlib-25.08.0-2.el9.x86_64 93/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : qatlib-service-25.08.0-2.el9.x86_64 94/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : qatzip-libs-1.3.1-1.el9.x86_64 95/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 96/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlsec1-1.2.29-13.el9.x86_64 97/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlsec1-openssl-1.2.29-13.el9.x86_64 98/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 99/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lua-devel-5.4.4-4.el9.x86_64 100/152 2026-03-31T19:01:31.869 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : protobuf-compiler-3.14.0-17.el9.x86_64 101/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : abseil-cpp-20211102.0-4.el9.x86_64 102/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 103/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : grpc-data-1.46.7-10.el9.noarch 104/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 105/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 106/152 2026-03-31T19:01:31.869 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 107/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 108/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : luarocks-3.9.2-5.el9.noarch 109/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 110/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 111/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 112/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 113/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 114/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 115/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 116/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
python3-cheroot-10.0.1-5.el9.noarch 117/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cherrypy-18.10.0-5.el9.noarch 118/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 119/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-grpcio-1.46.7-10.el9.x86_64 120/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-grpcio-tools-1.46.7-10.el9.x86_64 121/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-influxdb-5.3.1-1.el9.noarch 122/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-isodate-0.6.1-3.el9.noarch 123/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 124/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 125/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 126/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 127/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 128/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 129/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 130/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 131/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-msgpack-1.0.3-2.el9.x86_64 132/152 2026-03-31T19:01:31.870 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 133/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 134/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 135/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 136/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 137/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 138/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-saml-1.16.0-1.el9.noarch 139/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 140/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 141/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 142/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-xmlsec-1.3.13-1.el9.x86_64 143/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 144/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 145/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : re2-1:20211101-20.el9.x86_64 146/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : s3cmd-2.4.0-1.el9.noarch 147/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 148/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: 
Verifying : librados2-2:20.2.0-721.g5bb32787.el9.x86_64 149/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 150/152 2026-03-31T19:01:31.870 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:20.2.0-721.g5bb32787.el9.x86_64 151/152 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 152/152 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout:Upgraded: 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: librados2-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: librbd1-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout:Installed: 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: abseil-cpp-20211102.0-4.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: bzip2-1.0.8-11.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards-2:20.2.0-721.g5bb32787.el9.noarch 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds-2:20.2.0-721.g5bb32787.el9.x86_64 
2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm-2:20.2.0-721.g5bb32787.el9.noarch 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard-2:20.2.0-721.g5bb32787.el9.noarch 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local-2:20.2.0-721.g5bb32787.el9.noarch 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-k8sevents-2:20.2.0-721.g5bb32787.el9.noarch 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core-2:20.2.0-721.g5bb32787.el9.noarch 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook-2:20.2.0-721.g5bb32787.el9.noarch 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.971 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts-2:20.2.0-721.g5bb32787.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: ceph-volume-2:20.2.0-721.g5bb32787.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: cephadm-2:20.2.0-721.g5bb32787.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: cryptsetup-2.8.1-3.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-31T19:01:31.972 
INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: fuse-2.9.9-17.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: grpc-data-1.46.7-10.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-daemon-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-proxy2-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libconfig-1.7.2-9.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libnbd-1.20.3-4.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-31T19:01:31.972 
INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: librgw2-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: lua-5.4.4-4.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: lua-devel-5.4.4-4.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: luarocks-3.9.2-5.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: mailcap-2.1.49-5.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: nvme-cli-2.16-1.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: pciutils-3.7.0-7.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: perl-Benchmark-1.23-483.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: perl-Test-Harness-1:3.42-461.el9.noarch 
2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: protobuf-3.14.0-17.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: protobuf-compiler-3.14.0-17.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot-10.0.1-5.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy-18.10.0-5.el9.noarch 2026-03-31T19:01:31.972 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: 
python3-grpcio-1.46.7-10.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-grpcio-tools-1.46.7-10.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-influxdb-5.3.1-1.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-isodate-0.6.1-3.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-jmespath-1.0.1-1.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-lxml-4.6.5-3.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-msgpack-1.0.3-2.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-31T19:01:31.973 
INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-packaging-20.9-5.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply-3.11-14.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-protobuf-3.14.0-17.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyparsing-2.4.7-9.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa-4.9-2.el9.noarch 
2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-saml-1.16.0-1.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmlsec-1.3.13-1.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: qatlib-25.08.0-2.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: qatlib-service-25.08.0-2.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: qatzip-libs-1.3.1-1.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: re2-1:20211101-20.el9.x86_64 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: s3cmd-2.4.0-1.el9.noarch 2026-03-31T19:01:31.973 INFO:teuthology.orchestra.run.vm05.stdout: smartmontools-1:7.2-10.el9.x86_64 2026-03-31T19:01:31.973 
INFO:teuthology.orchestra.run.vm05.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-31T19:01:31.974 INFO:teuthology.orchestra.run.vm05.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-31T19:01:31.974 INFO:teuthology.orchestra.run.vm05.stdout: unzip-6.0-59.el9.x86_64 2026-03-31T19:01:31.974 INFO:teuthology.orchestra.run.vm05.stdout: xmlsec1-1.2.29-13.el9.x86_64 2026-03-31T19:01:31.974 INFO:teuthology.orchestra.run.vm05.stdout: xmlsec1-openssl-1.2.29-13.el9.x86_64 2026-03-31T19:01:31.974 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-31T19:01:31.974 INFO:teuthology.orchestra.run.vm05.stdout: zip-3.0-35.el9.x86_64 2026-03-31T19:01:31.974 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:01:31.974 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-31T19:01:32.068 DEBUG:teuthology.parallel:result is None 2026-03-31T19:01:32.068 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5bb3278730741031382ca9c3dc9d221a942e06a2 2026-03-31T19:01:32.668 DEBUG:teuthology.orchestra.run.vm05:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}' 2026-03-31T19:01:32.688 INFO:teuthology.orchestra.run.vm05.stdout:20.2.0-721.g5bb32787.el9 2026-03-31T19:01:32.688 INFO:teuthology.packaging:The installed version of ceph is 20.2.0-721.g5bb32787.el9 2026-03-31T19:01:32.688 INFO:teuthology.task.install:The correct ceph version 20.2.0-721.g5bb32787 is installed. 2026-03-31T19:01:32.688 INFO:teuthology.task.install.util:Shipping valgrind.supp... 2026-03-31T19:01:32.688 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-31T19:01:32.688 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp 2026-03-31T19:01:32.753 INFO:teuthology.task.install.util:Shipping 'daemon-helper'... 
2026-03-31T19:01:32.753 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-31T19:01:32.753 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/daemon-helper
2026-03-31T19:01:32.820 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-31T19:01:32.884 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-31T19:01:32.884 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-31T19:01:32.884 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-31T19:01:32.947 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-31T19:01:33.011 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-31T19:01:33.011 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-31T19:01:33.011 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/stdin-killer
2026-03-31T19:01:33.074 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-31T19:01:33.137 INFO:teuthology.run_tasks:Running task workunit...
2026-03-31T19:01:33.141 INFO:tasks.workunit:Pulling workunits from ref 0392f78529848ec72469e8e431875cb98d3a5fb4
2026-03-31T19:01:33.141 INFO:tasks.workunit:Making a separate scratch dir for every client...
2026-03-31T19:01:33.141 INFO:tasks.workunit:timeout=3h
2026-03-31T19:01:33.141 INFO:tasks.workunit:cleanup=True
2026-03-31T19:01:33.141 DEBUG:teuthology.orchestra.run.vm05:> stat -- /home/ubuntu/cephtest/mnt.0
2026-03-31T19:01:33.192 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-31T19:01:33.192 INFO:teuthology.orchestra.run.vm05.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory
2026-03-31T19:01:33.192 DEBUG:teuthology.orchestra.run.vm05:> mkdir -- /home/ubuntu/cephtest/mnt.0
2026-03-31T19:01:33.247 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0
2026-03-31T19:01:33.247 DEBUG:teuthology.orchestra.run.vm05:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0
2026-03-31T19:01:33.302 DEBUG:teuthology.orchestra.run.vm05:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 0392f78529848ec72469e8e431875cb98d3a5fb4
2026-03-31T19:01:33.357 INFO:tasks.workunit.client.0.vm05.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'...
2026-03-31T19:02:08.769 INFO:tasks.workunit.client.0.vm05.stderr:Note: switching to '0392f78529848ec72469e8e431875cb98d3a5fb4'.
2026-03-31T19:02:08.769 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-31T19:02:08.769 INFO:tasks.workunit.client.0.vm05.stderr:You are in 'detached HEAD' state. You can look around, make experimental
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:changes and commit them, and you can discard any commits you make in this
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:state without impacting any branches by switching back to a branch.
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:If you want to create a new branch to retain commits you create, you may
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:do so (now or later) by using -c with the switch command. Example:
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr: git switch -c
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:Or undo this operation with:
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr: git switch -
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:Turn off this advice by setting config variable advice.detachedHead to false
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-31T19:02:08.770 INFO:tasks.workunit.client.0.vm05.stderr:HEAD is now at 0392f785298 qa/tasks/keystone: restart mariadb for rocky and alma linux too
2026-03-31T19:02:08.776 DEBUG:teuthology.orchestra.run.vm05:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/standalone && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0
2026-03-31T19:02:08.832 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-31T19:02:08.833 DEBUG:teuthology.orchestra.run.vm05:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout
2026-03-31T19:02:08.888 INFO:tasks.workunit:Running workunits matching crush on client.0...
2026-03-31T19:02:08.888 INFO:tasks.workunit:Running workunit crush/crush-choose-args.sh...
2026-03-31T19:02:08.888 DEBUG:teuthology.orchestra.run.vm05:workunit test crush/crush-choose-args.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=0392f78529848ec72469e8e431875cb98d3a5fb4 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh
2026-03-31T19:02:08.951 INFO:tasks.workunit.client.0.vm05.stderr:stty: 'standard input': Inappropriate ioctl for device
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: '
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2398: main: export PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2398: main: PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2399: main: export PYTHONWARNINGS=ignore
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2399: main: PYTHONWARNINGS=ignore
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2400: main: export CEPH_CONF=/dev/null
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2400: main: CEPH_CONF=/dev/null
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2401: main: unset CEPH_ARGS
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2403: main: local code
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2404: main: run td/crush-choose-args
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:21: run: local dir=td/crush-choose-args
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:22: run: shift
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:24: run: export CEPH_MON=127.0.0.1:7131
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:24: run: CEPH_MON=127.0.0.1:7131
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:25: run: export CEPH_ARGS
2026-03-31T19:02:08.954 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:26: run: uuidgen
2026-03-31T19:02:08.955 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:26: run: CEPH_ARGS+='--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none '
2026-03-31T19:02:08.955 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:27: run: CEPH_ARGS+='--mon-host=127.0.0.1:7131 '
2026-03-31T19:02:08.955 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:28: run: CEPH_ARGS+='--crush-location=root=default,host=HOST '
2026-03-31T19:02:08.955 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:29: run: CEPH_ARGS+='--osd-crush-initial-weight=3 '
2026-03-31T19:02:08.955 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:35: run: CEPH_ARGS+='--osd-class-update-on-start=false '
2026-03-31T19:02:08.955 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:37: run: set
2026-03-31T19:02:08.956 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:37: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p'
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:37: run: local 'funcs=TEST_choose_args_update
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr:TEST_move_bucket
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr:TEST_no_update_weight_set
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr:TEST_reweight'
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:38: run: for func in $funcs
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:39: run: setup td/crush-choose-args
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:122: setup: local dir=td/crush-choose-args
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:123: setup: teardown td/crush-choose-args
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-choose-args
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs=
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-choose-args KILL
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace
2026-03-31T19:02:08.957 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true
2026-03-31T19:02:08.958 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true
2026-03-31T19:02:08.958 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true
2026-03-31T19:02:08.958 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace
2026-03-31T19:02:08.959 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0
2026-03-31T19:02:08.960 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname
2026-03-31T19:02:08.960 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-31T19:02:08.960 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T .
2026-03-31T19:02:08.961 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']'
2026-03-31T19:02:08.961 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no
2026-03-31T19:02:08.961 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern
2026-03-31T19:02:08.962 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:08.962 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']'
2026-03-31T19:02:08.963 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$'
2026-03-31T19:02:08.963 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:08.963 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-31T19:02:08.964 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']'
2026-03-31T19:02:08.964 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-choose-args
2026-03-31T19:02:08.965 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir
2026-03-31T19:02:08.965 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:08.965 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:08.965 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.51199
2026-03-31T19:02:08.966 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']'
2026-03-31T19:02:08.966 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0
2026-03-31T19:02:08.966 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:124: setup: mkdir -p td/crush-choose-args
2026-03-31T19:02:08.967 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: get_asok_dir
2026-03-31T19:02:08.967 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:08.967 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:08.967 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: mkdir -p /tmp/ceph-asok.51199
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: ulimit -n
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: '[' 1024 -le 1024 ']'
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:127: setup: ulimit -n 4096
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:129: setup: '[' -z '' ']'
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:130: setup: trap 'teardown td/crush-choose-args 1' TERM HUP INT
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:40: run: TEST_choose_args_update td/crush-choose-args
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:49: TEST_choose_args_update: local dir=td/crush-choose-args
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:51: TEST_choose_args_update: run_mon td/crush-choose-args a
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:439: run_mon: local dir=td/crush-choose-args
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:440: run_mon: shift
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:441: run_mon: local id=a
2026-03-31T19:02:08.968 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:442: run_mon: shift
2026-03-31T19:02:08.969 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:443: run_mon: local data=td/crush-choose-args/a
2026-03-31T19:02:08.969 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:446: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-choose-args/a --run-dir=td/crush-choose-args
2026-03-31T19:02:09.001 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: get_asok_path
2026-03-31T19:02:09.001 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=
2026-03-31T19:02:09.001 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']'
2026-03-31T19:02:09.001 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir
2026-03-31T19:02:09.001 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:09.001 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:09.002 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok'
2026-03-31T19:02:09.002 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-choose-args/a '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --mon-cluster-log-file=td/crush-choose-args/log --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-31T19:02:09.039 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: cat
2026-03-31T19:02:09.039 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a fsid
2026-03-31T19:02:09.039 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon
2026-03-31T19:02:09.039 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a
2026-03-31T19:02:09.039 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=fsid
2026-03-31T19:02:09.040 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a
2026-03-31T19:02:09.040 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a
2026-03-31T19:02:09.040 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']'
2026-03-31T19:02:09.040 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir
2026-03-31T19:02:09.040 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:09.040 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:09.040 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.51199/ceph-mon.a.asok
2026-03-31T19:02:09.040 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.51199/ceph-mon.a.asok
2026-03-31T19:02:09.040 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS=
2026-03-31T19:02:09.040 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.51199/ceph-mon.a.asok config get fsid
2026-03-31T19:02:09.041 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .fsid
2026-03-31T19:02:09.093 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a mon_host
2026-03-31T19:02:09.093 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon
2026-03-31T19:02:09.093 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a
2026-03-31T19:02:09.093 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=mon_host
2026-03-31T19:02:09.093 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a
2026-03-31T19:02:09.093 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a
2026-03-31T19:02:09.093 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']'
2026-03-31T19:02:09.094 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir
2026-03-31T19:02:09.094 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:09.094 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:09.094 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.51199/ceph-mon.a.asok
2026-03-31T19:02:09.094 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.51199/ceph-mon.a.asok
2026-03-31T19:02:09.094 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS=
2026-03-31T19:02:09.094 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.51199/ceph-mon.a.asok config get mon_host
2026-03-31T19:02:09.094 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .mon_host
2026-03-31T19:02:09.147 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:52: TEST_choose_args_update: run_mgr td/crush-choose-args x
2026-03-31T19:02:09.147 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:545: run_mgr: local dir=td/crush-choose-args
2026-03-31T19:02:09.147 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: run_mgr: shift
2026-03-31T19:02:09.147 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: run_mgr: local id=x
2026-03-31T19:02:09.147 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:548: run_mgr: shift
2026-03-31T19:02:09.147 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:549: run_mgr: local data=td/crush-choose-args/x
2026-03-31T19:02:09.147 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:551: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-31T19:02:09.258 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: get_asok_path
2026-03-31T19:02:09.258 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=
2026-03-31T19:02:09.258 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']'
2026-03-31T19:02:09.258 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir
2026-03-31T19:02:09.258 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:09.258 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:09.258 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok'
2026-03-31T19:02:09.259 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-31T19:02:09.259 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/crush-choose-args/x '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-31T19:02:09.281 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:53: TEST_choose_args_update: run_osd td/crush-choose-args 0
2026-03-31T19:02:09.281 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-choose-args
2026-03-31T19:02:09.281 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift
2026-03-31T19:02:09.281 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=0
2026-03-31T19:02:09.281 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift
2026-03-31T19:02:09.281 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-choose-args/0
2026-03-31T19:02:09.281 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false '
2026-03-31T19:02:09.281 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/0'
2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/0/journal'
2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir='
2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+=
2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:09.282 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:02:09.283 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+= 2026-03-31T19:02:09.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-choose-args/0 2026-03-31T19:02:09.284 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: 
run_osd: uuidgen 2026-03-31T19:02:09.284 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=e5bcb1ee-0938-403e-9704-8afb11bd1d98 2026-03-31T19:02:09.284 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd0 e5bcb1ee-0938-403e-9704-8afb11bd1d98' 2026-03-31T19:02:09.284 INFO:tasks.workunit.client.0.vm05.stdout:add osd0 e5bcb1ee-0938-403e-9704-8afb11bd1d98 2026-03-31T19:02:09.284 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:02:09.299 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQAxGsxprb+/ERAA0C4OWKDmdz4S792npEPp5Q== 2026-03-31T19:02:09.299 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQAxGsxprb+/ERAA0C4OWKDmdz4S792npEPp5Q=="}' 2026-03-31T19:02:09.299 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new e5bcb1ee-0938-403e-9704-8afb11bd1d98 -i td/crush-choose-args/0/new.json 2026-03-31T19:02:09.427 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:09.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-choose-args/0/new.json 2026-03-31T19:02:09.436 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 0 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 
--osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAxGsxprb+/ERAA0C4OWKDmdz4S792npEPp5Q== --osd-uuid e5bcb1ee-0938-403e-9704-8afb11bd1d98 2026-03-31T19:02:09.457 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:09.456+0000 7f870ec6c900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:09.459 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:09.458+0000 7f870ec6c900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:09.460 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:09.459+0000 7f870ec6c900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:09.460 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:09.459+0000 7f870ec6c900 -1 bdev(0x56294a4b0c00 td/crush-choose-args/0/block) open stat got: (1) Operation not permitted 2026-03-31T19:02:09.460 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:09.459+0000 7f870ec6c900 -1 bluestore(td/crush-choose-args/0) _read_fsid unparsable uuid 2026-03-31T19:02:09.948 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-choose-args/0/keyring 2026-03-31T19:02:09.948 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:02:09.949 INFO:tasks.workunit.client.0.vm05.stdout:adding osd0 key to auth repository 2026-03-31T19:02:09.949 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd0 key to auth repository 2026-03-31T19:02:09.949 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-choose-args/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:02:10.063 INFO:tasks.workunit.client.0.vm05.stdout:start osd.0 2026-03-31T19:02:10.064 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.0 2026-03-31T19:02:10.064 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 0 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 
--osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:02:10.064 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:02:10.064 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:02:10.064 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:02:10.082 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:10.081+0000 7fc49e8a9900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:10.084 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:10.083+0000 7fc49e8a9900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:10.085 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:10.084+0000 7fc49e8a9900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:10.192 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 0 2026-03-31T19:02:10.192 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:02:10.192 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=0 2026-03-31T19:02:10.192 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:02:10.192 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:02:10.192 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:10.192 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:02:10.193 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:10.193 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:10.193 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:02:10.222 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:10.220+0000 7fc49e8a9900 -1 Falling back to public interface 2026-03-31T19:02:10.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:02:10.384 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:10.382+0000 7fc49e8a9900 -1 osd.0 0 log_to_monitors true 2026-03-31T19:02:11.318 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:02:11.318 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:11.318 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:02:11.319 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:02:11.319 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:11.319 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:02:11.430 INFO:tasks.workunit.client.0.vm05.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/2626481719,v1:127.0.0.1:6801/2626481719] [v2:127.0.0.1:6802/2626481719,v1:127.0.0.1:6803/2626481719] exists,up e5bcb1ee-0938-403e-9704-8afb11bd1d98 2026-03-31T19:02:11.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:02:11.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:02:11.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:02:11.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:55: TEST_choose_args_update: ceph osd set-require-min-compat-client luminous 2026-03-31T19:02:11.660 INFO:tasks.workunit.client.0.vm05.stderr:set require_min_compat_client to luminous 2026-03-31T19:02:11.670 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:56: TEST_choose_args_update: ceph osd getcrushmap 2026-03-31T19:02:11.779 INFO:tasks.workunit.client.0.vm05.stderr:2 2026-03-31T19:02:11.786 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:57: TEST_choose_args_update: crushtool -d td/crush-choose-args/map -o td/crush-choose-args/map.txt 2026-03-31T19:02:11.801 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:58: TEST_choose_args_update: sed -i -e '/end crush map/d' td/crush-choose-args/map.txt 2026-03-31T19:02:11.802 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:59: TEST_choose_args_update: cat 2026-03-31T19:02:11.803 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:82: TEST_choose_args_update: crushtool -c td/crush-choose-args/map.txt -o td/crush-choose-args/map-new 2026-03-31T19:02:11.818 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:83: TEST_choose_args_update: ceph osd setcrushmap -i td/crush-choose-args/map-new 2026-03-31T19:02:12.163 INFO:tasks.workunit.client.0.vm05.stderr:4 2026-03-31T19:02:12.174 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:84: TEST_choose_args_update: ceph osd crush tree 2026-03-31T19:02:12.425 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT 0 TYPE NAME 2026-03-31T19:02:12.425 INFO:tasks.workunit.client.0.vm05.stdout:-1 3.00000 root default 2026-03-31T19:02:12.425 INFO:tasks.workunit.client.0.vm05.stdout:-2 3.00000 2.00000 host HOST 2026-03-31T19:02:12.425 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0 
2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:86: TEST_choose_args_update: run_osd td/crush-choose-args 1 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-choose-args 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=1 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-choose-args/1 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false ' 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:02:12.434 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/1' 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/1/journal' 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:02:12.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:12.435 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:02:12.435 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+= 2026-03-31T19:02:12.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-choose-args/1 2026-03-31T19:02:12.436 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:02:12.437 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=18f28975-f621-41dc-b749-f96cffceef1a 2026-03-31T19:02:12.437 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd1 18f28975-f621-41dc-b749-f96cffceef1a' 2026-03-31T19:02:12.437 INFO:tasks.workunit.client.0.vm05.stdout:add osd1 18f28975-f621-41dc-b749-f96cffceef1a 2026-03-31T19:02:12.437 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:02:12.450 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQA0Gsxpc+nMGhAAgP+1E+TInuPcEa8g22dWTg== 2026-03-31T19:02:12.450 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQA0Gsxpc+nMGhAAgP+1E+TInuPcEa8g22dWTg=="}' 2026-03-31T19:02:12.450 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new 18f28975-f621-41dc-b749-f96cffceef1a -i td/crush-choose-args/1/new.json 2026-03-31T19:02:12.696 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:02:12.706 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-choose-args/1/new.json 2026-03-31T19:02:12.707 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 1 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQA0Gsxpc+nMGhAAgP+1E+TInuPcEa8g22dWTg== --osd-uuid 18f28975-f621-41dc-b749-f96cffceef1a 2026-03-31T19:02:12.727 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:12.725+0000 7f300ee10900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:12.728 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:12.727+0000 7f300ee10900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:12.729 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:12.728+0000 7f300ee10900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:12.729 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:12.728+0000 7f300ee10900 -1 bdev(0x557502248c00 td/crush-choose-args/1/block) open stat got: (1) Operation not permitted 2026-03-31T19:02:12.729 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:12.728+0000 7f300ee10900 -1 bluestore(td/crush-choose-args/1) _read_fsid unparsable uuid 2026-03-31T19:02:13.132 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-choose-args/1/keyring 2026-03-31T19:02:13.132 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:02:13.133 INFO:tasks.workunit.client.0.vm05.stdout:adding osd1 key to auth repository 2026-03-31T19:02:13.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd1 key to auth repository 2026-03-31T19:02:13.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-choose-args/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:02:13.475 INFO:tasks.workunit.client.0.vm05.stdout:start osd.1 2026-03-31T19:02:13.475 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.1 2026-03-31T19:02:13.475 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 1 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 
--osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:02:13.475 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:02:13.475 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:02:13.475 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:02:13.494 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:13.493+0000 7fdac4611900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:13.496 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:13.495+0000 7fdac4611900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:13.497 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:13.496+0000 7fdac4611900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:13.650 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:13.648+0000 7fdac4611900 -1 Falling back to public interface 2026-03-31T19:02:13.726 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 1 2026-03-31T19:02:13.726 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:02:13.726 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=1 2026-03-31T19:02:13.726 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:02:13.726 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:02:13.726 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:13.726 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:02:13.726 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:13.726 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:13.726 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up' 2026-03-31T19:02:13.793 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:13.791+0000 7fdac4611900 -1 osd.1 0 log_to_monitors true 2026-03-31T19:02:13.978 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:02:14.980 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ ))
2026-03-31T19:02:14.980 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 ))
2026-03-31T19:02:14.980 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1
2026-03-31T19:02:14.980 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-31T19:02:14.980 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump
2026-03-31T19:02:14.980 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up'
2026-03-31T19:02:15.228 INFO:tasks.workunit.client.0.vm05.stdout:osd.1 up in weight 1 up_from 11 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3773461071,v1:127.0.0.1:6811/3773461071] [v2:127.0.0.1:6812/3773461071,v1:127.0.0.1:6813/3773461071] exists,up 18f28975-f621-41dc-b749-f96cffceef1a
2026-03-31T19:02:15.228 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0
2026-03-31T19:02:15.228 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break
2026-03-31T19:02:15.228 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0
2026-03-31T19:02:15.228 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:87: TEST_choose_args_update: ceph osd crush tree
2026-03-31T19:02:15.473 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT 0 TYPE NAME
2026-03-31T19:02:15.473 INFO:tasks.workunit.client.0.vm05.stdout:-1 6.00000 root default
2026-03-31T19:02:15.473 INFO:tasks.workunit.client.0.vm05.stdout:-2 6.00000 5.00000 host HOST
2026-03-31T19:02:15.473 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0
2026-03-31T19:02:15.473 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 3.00000 osd.1
2026-03-31T19:02:15.482 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:88: TEST_choose_args_update: ceph osd getcrushmap
2026-03-31T19:02:15.721 INFO:tasks.workunit.client.0.vm05.stderr:5
2026-03-31T19:02:15.731 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:89: TEST_choose_args_update: crushtool -d td/crush-choose-args/map-one-more -o td/crush-choose-args/map-one-more.txt
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:90: TEST_choose_args_update: cat td/crush-choose-args/map-one-more.txt
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:# begin crush map
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:tunable choose_local_tries 0
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:tunable choose_local_fallback_tries 0
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:tunable choose_total_tries 50
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:tunable chooseleaf_descend_once 1
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:tunable chooseleaf_vary_r 1
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:tunable chooseleaf_stable 1
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:tunable straw_calc_version 1
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:tunable allowed_bucket_algs 54
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:# devices
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:device 0 osd.0
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:device 1 osd.1
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:# types
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 0 osd
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 1 host
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 2 chassis
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 3 rack
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 4 row
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 5 pdu
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 6 pod
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 7 room
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 8 datacenter
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 9 zone
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 10 region
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:type 11 root
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:
2026-03-31T19:02:15.744 INFO:tasks.workunit.client.0.vm05.stdout:# buckets
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:host HOST {
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: id -2 # do not change unnecessarily
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: # weight 6.00000
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: alg straw2
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: hash 0 # rjenkins1
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: item osd.0 weight 3.00000
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: item osd.1 weight 3.00000
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:}
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:root default {
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: id -1 # do not change unnecessarily
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: # weight 6.00000
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: alg straw2
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: hash 0 # rjenkins1
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: item HOST weight 6.00000
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:}
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:# rules
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:rule replicated_rule {
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: id 0
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: type replicated
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: step take default
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: step choose firstn 0 type osd
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: step emit
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:}
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:# choose_args
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:choose_args 0 {
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: {
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: bucket_id -1
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: weight_set [
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: [ 5.00000 ]
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: [ 5.00000 ]
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: ]
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: ids [ -10 ]
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: }
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: {
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: bucket_id -2
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: weight_set [
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: [ 2.00000 3.00000 ]
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: [ 2.00000 3.00000 ]
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: ]
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: ids [ -20 1 ]
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout: }
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:}
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:
2026-03-31T19:02:15.745 INFO:tasks.workunit.client.0.vm05.stdout:# end crush map
2026-03-31T19:02:15.746 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:91: TEST_choose_args_update: diff -u td/crush-choose-args/map-one-more.txt /home/ubuntu/cephtest/clone.client.0/src/test/crush/crush-choose-args-expected-one-more-3.txt
2026-03-31T19:02:15.746 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:93: TEST_choose_args_update: destroy_osd td/crush-choose-args 1
2026-03-31T19:02:15.746 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:776: destroy_osd: local dir=td/crush-choose-args
2026-03-31T19:02:15.746 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:777: destroy_osd: local id=1
2026-03-31T19:02:15.746 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:779: destroy_osd: ceph osd out osd.1
2026-03-31T19:02:16.040 INFO:tasks.workunit.client.0.vm05.stderr:osd.1 is already out.
2026-03-31T19:02:16.050 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:780: destroy_osd: kill_daemons td/crush-choose-args TERM osd.1
2026-03-31T19:02:16.050 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace
2026-03-31T19:02:16.050 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true
2026-03-31T19:02:16.050 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true
2026-03-31T19:02:16.050 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true
2026-03-31T19:02:16.050 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace
2026-03-31T19:02:16.156 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0
2026-03-31T19:02:16.156 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:781: destroy_osd: ceph osd down osd.1
2026-03-31T19:02:16.398 INFO:tasks.workunit.client.0.vm05.stderr:osd.1 is already down.
2026-03-31T19:02:16.408 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:782: destroy_osd: ceph osd purge osd.1 --yes-i-really-mean-it
2026-03-31T19:02:16.647 INFO:tasks.workunit.client.0.vm05.stderr:osd.1 does not exist
2026-03-31T19:02:16.658 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:783: destroy_osd: teardown td/crush-choose-args/1
2026-03-31T19:02:16.658 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-choose-args/1
2026-03-31T19:02:16.658 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs=
2026-03-31T19:02:16.658 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-choose-args/1 KILL
2026-03-31T19:02:16.658 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace
2026-03-31T19:02:16.658 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true
2026-03-31T19:02:16.658 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true
2026-03-31T19:02:16.658 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true
2026-03-31T19:02:16.658 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace
2026-03-31T19:02:16.660 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0
2026-03-31T19:02:16.660 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname
2026-03-31T19:02:16.660 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-31T19:02:16.661 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T .
2026-03-31T19:02:16.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']'
2026-03-31T19:02:16.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no
2026-03-31T19:02:16.662 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern
2026-03-31T19:02:16.663 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:16.663 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']'
2026-03-31T19:02:16.663 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$'
2026-03-31T19:02:16.663 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:16.663 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-31T19:02:16.664 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']'
2026-03-31T19:02:16.664 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-choose-args/1
2026-03-31T19:02:16.668 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir
2026-03-31T19:02:16.668 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:16.668 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:16.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.51199
2026-03-31T19:02:16.669 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']'
2026-03-31T19:02:16.669 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0
2026-03-31T19:02:16.669 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:784: destroy_osd: rm -fr td/crush-choose-args/1
2026-03-31T19:02:16.670 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:94: TEST_choose_args_update: ceph osd crush tree
2026-03-31T19:02:16.911 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT 0 TYPE NAME
2026-03-31T19:02:16.912 INFO:tasks.workunit.client.0.vm05.stdout:-1 3.00000 root default
2026-03-31T19:02:16.912 INFO:tasks.workunit.client.0.vm05.stdout:-2 3.00000 2.00000 host HOST
2026-03-31T19:02:16.912 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0
2026-03-31T19:02:16.922 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:95: TEST_choose_args_update: ceph osd getcrushmap
2026-03-31T19:02:17.161 INFO:tasks.workunit.client.0.vm05.stderr:6
2026-03-31T19:02:17.171 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:96: TEST_choose_args_update: crushtool -d td/crush-choose-args/map-one-less -o td/crush-choose-args/map-one-less.txt
2026-03-31T19:02:17.184 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:97: TEST_choose_args_update: diff -u td/crush-choose-args/map-one-less.txt td/crush-choose-args/map.txt
2026-03-31T19:02:17.184 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:41: run: teardown td/crush-choose-args
2026-03-31T19:02:17.184 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-choose-args
2026-03-31T19:02:17.184 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs=
2026-03-31T19:02:17.184 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-choose-args KILL
2026-03-31T19:02:17.184 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace
2026-03-31T19:02:17.184 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true
2026-03-31T19:02:17.185 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true
2026-03-31T19:02:17.185 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true
2026-03-31T19:02:17.185 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace
2026-03-31T19:02:17.292 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0
2026-03-31T19:02:17.292 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname
2026-03-31T19:02:17.293 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-31T19:02:17.293 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T .
2026-03-31T19:02:17.294 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']'
2026-03-31T19:02:17.294 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no
2026-03-31T19:02:17.294 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern
2026-03-31T19:02:17.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:17.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']'
2026-03-31T19:02:17.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$'
2026-03-31T19:02:17.296 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:17.296 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-31T19:02:17.297 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']'
2026-03-31T19:02:17.297 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-choose-args
2026-03-31T19:02:17.303 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir
2026-03-31T19:02:17.303 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:17.303 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:17.303 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.51199
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']'
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:38: run: for func in $funcs
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:39: run: setup td/crush-choose-args
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:122: setup: local dir=td/crush-choose-args
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:123: setup: teardown td/crush-choose-args
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-choose-args
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs=
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-choose-args KILL
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true
2026-03-31T19:02:17.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace
2026-03-31T19:02:17.306 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0
2026-03-31T19:02:17.307 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname
2026-03-31T19:02:17.307 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-31T19:02:17.308 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T .
2026-03-31T19:02:17.308 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']'
2026-03-31T19:02:17.308 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no
2026-03-31T19:02:17.309 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern
2026-03-31T19:02:17.309 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:17.309 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']'
2026-03-31T19:02:17.310 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$'
2026-03-31T19:02:17.310 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:17.311 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-31T19:02:17.312 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']'
2026-03-31T19:02:17.312 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-choose-args
2026-03-31T19:02:17.313 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir
2026-03-31T19:02:17.313 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:17.313 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:17.313 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.51199
2026-03-31T19:02:17.314 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']'
2026-03-31T19:02:17.314 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0
2026-03-31T19:02:17.314 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:124: setup: mkdir -p td/crush-choose-args
2026-03-31T19:02:17.315 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: get_asok_dir
2026-03-31T19:02:17.315 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:17.315 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:17.315 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: mkdir -p /tmp/ceph-asok.51199
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: ulimit -n
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: '[' 4096 -le 1024 ']'
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:129: setup: '[' -z '' ']'
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:130: setup: trap 'teardown td/crush-choose-args 1' TERM HUP INT
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:40: run: TEST_move_bucket td/crush-choose-args
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:195: TEST_move_bucket: local dir=td/crush-choose-args
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:197: TEST_move_bucket: run_mon td/crush-choose-args a
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:439: run_mon: local dir=td/crush-choose-args
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:440: run_mon: shift
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:441: run_mon: local id=a
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:442: run_mon: shift
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:443: run_mon: local data=td/crush-choose-args/a
2026-03-31T19:02:17.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:446: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-choose-args/a --run-dir=td/crush-choose-args
2026-03-31T19:02:17.348 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: get_asok_path
2026-03-31T19:02:17.348 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=
2026-03-31T19:02:17.348 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']'
2026-03-31T19:02:17.348 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir
2026-03-31T19:02:17.348 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:17.348 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:17.348 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok'
2026-03-31T19:02:17.349 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-choose-args/a '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --mon-cluster-log-file=td/crush-choose-args/log --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-31T19:02:17.379 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: cat
2026-03-31T19:02:17.380 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a fsid
2026-03-31T19:02:17.380 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon
2026-03-31T19:02:17.380 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a
2026-03-31T19:02:17.380 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=fsid
2026-03-31T19:02:17.380 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a
2026-03-31T19:02:17.381 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a
2026-03-31T19:02:17.381 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']'
2026-03-31T19:02:17.381 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir
2026-03-31T19:02:17.381 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:17.381 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:17.381 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.51199/ceph-mon.a.asok
2026-03-31T19:02:17.381 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.51199/ceph-mon.a.asok
2026-03-31T19:02:17.382 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS=
2026-03-31T19:02:17.382 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.51199/ceph-mon.a.asok config get fsid
2026-03-31T19:02:17.382 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .fsid
2026-03-31T19:02:17.433 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a mon_host
2026-03-31T19:02:17.433 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon
2026-03-31T19:02:17.433 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a
2026-03-31T19:02:17.433 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=mon_host
2026-03-31T19:02:17.433 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a
2026-03-31T19:02:17.433 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a
2026-03-31T19:02:17.434 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']'
2026-03-31T19:02:17.434 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir
2026-03-31T19:02:17.434 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:17.434 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:17.434 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.51199/ceph-mon.a.asok
2026-03-31T19:02:17.434 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.51199/ceph-mon.a.asok
2026-03-31T19:02:17.434 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS=
2026-03-31T19:02:17.434 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.51199/ceph-mon.a.asok config get mon_host
2026-03-31T19:02:17.434 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .mon_host
2026-03-31T19:02:17.483 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:198: TEST_move_bucket: run_mgr td/crush-choose-args x
2026-03-31T19:02:17.483 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:545: run_mgr: local
dir=td/crush-choose-args 2026-03-31T19:02:17.483 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: run_mgr: shift 2026-03-31T19:02:17.483 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: run_mgr: local id=x 2026-03-31T19:02:17.483 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:548: run_mgr: shift 2026-03-31T19:02:17.483 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:549: run_mgr: local data=td/crush-choose-args/x 2026-03-31T19:02:17.483 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:551: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-31T19:02:17.597 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: get_asok_path 2026-03-31T19:02:17.597 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:17.597 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:17.597 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:17.598 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:17.598 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:17.598 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: 
get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:17.598 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-31T19:02:17.599 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/crush-choose-args/x '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:199: TEST_move_bucket: run_osd td/crush-choose-args 0 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-choose-args 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=0 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-choose-args/0 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 
'ceph_args=--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false ' 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/0' 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/0/journal' 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: 
get_asok_path: local name= 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:17.620 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: 
ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+= 2026-03-31T19:02:17.621 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-choose-args/0 2026-03-31T19:02:17.622 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:02:17.623 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=85aa8125-da6b-45f1-a3d3-f14208c16086 2026-03-31T19:02:17.623 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd0 85aa8125-da6b-45f1-a3d3-f14208c16086' 2026-03-31T19:02:17.623 INFO:tasks.workunit.client.0.vm05.stdout:add osd0 
85aa8125-da6b-45f1-a3d3-f14208c16086 2026-03-31T19:02:17.623 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:02:17.637 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQA5GsxpwwjoJRAArmXGZjk90F5Aad04wzEHTA== 2026-03-31T19:02:17.637 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQA5GsxpwwjoJRAArmXGZjk90F5Aad04wzEHTA=="}' 2026-03-31T19:02:17.637 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new 85aa8125-da6b-45f1-a3d3-f14208c16086 -i td/crush-choose-args/0/new.json 2026-03-31T19:02:17.758 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:17.765 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-choose-args/0/new.json 2026-03-31T19:02:17.766 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 0 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' 
--osd-mclock-profile=high_recovery_ops --mkfs --key AQA5GsxpwwjoJRAArmXGZjk90F5Aad04wzEHTA== --osd-uuid 85aa8125-da6b-45f1-a3d3-f14208c16086 2026-03-31T19:02:17.788 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:17.787+0000 7f21e6d41900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:17.790 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:17.788+0000 7f21e6d41900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:17.791 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:17.789+0000 7f21e6d41900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:17.791 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:17.790+0000 7f21e6d41900 -1 bdev(0x555acf3a6c00 td/crush-choose-args/0/block) open stat got: (1) Operation not permitted 2026-03-31T19:02:17.791 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:17.790+0000 7f21e6d41900 -1 bluestore(td/crush-choose-args/0) _read_fsid unparsable uuid 2026-03-31T19:02:18.299 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-choose-args/0/keyring 2026-03-31T19:02:18.299 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:02:18.300 INFO:tasks.workunit.client.0.vm05.stdout:adding osd0 key to auth repository 2026-03-31T19:02:18.300 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd0 key to auth repository 2026-03-31T19:02:18.300 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-choose-args/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:02:18.420 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.0 2026-03-31T19:02:18.420 INFO:tasks.workunit.client.0.vm05.stdout:start osd.0 2026-03-31T19:02:18.421 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 0 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:02:18.421 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:02:18.421 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:02:18.422 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:02:18.441 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:18.439+0000 7f7fc9612900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:18.456 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:18.455+0000 7f7fc9612900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:18.458 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:18.456+0000 7f7fc9612900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:18.533 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 0 2026-03-31T19:02:18.533 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:02:18.533 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=0 2026-03-31T19:02:18.533 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:02:18.533 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:02:18.533 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:18.533 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:02:18.533 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:18.533 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:18.533 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:02:18.545 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:18.543+0000 7f7fc9612900 -1 Falling back to public interface 2026-03-31T19:02:18.647 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: 
wait_for_osd: sleep 1 2026-03-31T19:02:18.696 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:18.695+0000 7f7fc9612900 -1 osd.0 0 log_to_monitors true 2026-03-31T19:02:19.649 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:02:19.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:02:19.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:19.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:02:19.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:19.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:02:19.768 INFO:tasks.workunit.client.0.vm05.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/940249359,v1:127.0.0.1:6801/940249359] [v2:127.0.0.1:6802/940249359,v1:127.0.0.1:6803/940249359] exists,up 85aa8125-da6b-45f1-a3d3-f14208c16086 2026-03-31T19:02:19.768 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:02:19.768 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:02:19.768 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:02:19.768 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:200: TEST_move_bucket: run_osd 
td/crush-choose-args 1 2026-03-31T19:02:19.768 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-choose-args 2026-03-31T19:02:19.768 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=1 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-choose-args/1 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false ' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/1' 2026-03-31T19:02:19.769 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/1/journal' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:19.769 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:02:19.769 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:02:19.769 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' '
2026-03-31T19:02:19.770 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=
2026-03-31T19:02:19.770 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-choose-args/1
2026-03-31T19:02:19.770 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen
2026-03-31T19:02:19.771 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=c524fb69-4bec-4568-a39e-8eb5619a5df9
2026-03-31T19:02:19.771 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd1 c524fb69-4bec-4568-a39e-8eb5619a5df9'
2026-03-31T19:02:19.771 INFO:tasks.workunit.client.0.vm05.stdout:add osd1 c524fb69-4bec-4568-a39e-8eb5619a5df9
2026-03-31T19:02:19.771 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key
2026-03-31T19:02:19.785 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQA7GsxpwpG6LhAAKPSsB8u6vqFAT45+1SDBMQ==
2026-03-31T19:02:19.785 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQA7GsxpwpG6LhAAKPSsB8u6vqFAT45+1SDBMQ=="}'
2026-03-31T19:02:19.785 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new c524fb69-4bec-4568-a39e-8eb5619a5df9 -i td/crush-choose-args/1/new.json
2026-03-31T19:02:20.046 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-31T19:02:20.055 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-choose-args/1/new.json
2026-03-31T19:02:20.056 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 1 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQA7GsxpwpG6LhAAKPSsB8u6vqFAT45+1SDBMQ== --osd-uuid c524fb69-4bec-4568-a39e-8eb5619a5df9
2026-03-31T19:02:20.077 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:20.076+0000 7f8ad2da3900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:20.079 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:20.078+0000 7f8ad2da3900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:20.080 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:20.079+0000 7f8ad2da3900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:20.080 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:20.079+0000 7f8ad2da3900 -1 bdev(0x55c66558ec00 td/crush-choose-args/1/block) open stat got: (1) Operation not permitted
2026-03-31T19:02:20.081 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:20.079+0000 7f8ad2da3900 -1 bluestore(td/crush-choose-args/1) _read_fsid unparsable uuid
2026-03-31T19:02:20.509 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-choose-args/1/keyring
2026-03-31T19:02:20.509 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat
2026-03-31T19:02:20.510 INFO:tasks.workunit.client.0.vm05.stdout:adding osd1 key to auth repository
2026-03-31T19:02:20.510 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd1 key to auth repository
2026-03-31T19:02:20.510 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-choose-args/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-31T19:02:20.855 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.1
2026-03-31T19:02:20.855 INFO:tasks.workunit.client.0.vm05.stdout:start osd.1
2026-03-31T19:02:20.855 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 1 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-31T19:02:20.856 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json
2026-03-31T19:02:20.856 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]'
2026-03-31T19:02:20.856 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"'
2026-03-31T19:02:20.879 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:20.877+0000 7f4691a67900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:20.881 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:20.879+0000 7f4691a67900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:20.882 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:20.880+0000 7f4691a67900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:21.075 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:21.073+0000 7f4691a67900 -1 Falling back to public interface
2026-03-31T19:02:21.102 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 1
2026-03-31T19:02:21.102 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up
2026-03-31T19:02:21.102 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=1
2026-03-31T19:02:21.102 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1
2026-03-31T19:02:21.102 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 ))
2026-03-31T19:02:21.102 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 ))
2026-03-31T19:02:21.102 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0
2026-03-31T19:02:21.102 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-31T19:02:21.102 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump
2026-03-31T19:02:21.102 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up'
2026-03-31T19:02:21.223 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:21.222+0000 7f4691a67900 -1 osd.1 0 log_to_monitors true
2026-03-31T19:02:21.362 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1
2026-03-31T19:02:22.363 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ ))
2026-03-31T19:02:22.363 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 ))
2026-03-31T19:02:22.363 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-31T19:02:22.363 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1
2026-03-31T19:02:22.363 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump
2026-03-31T19:02:22.363 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up'
2026-03-31T19:02:22.612 INFO:tasks.workunit.client.0.vm05.stdout:osd.1 up in weight 1 up_from 7 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3600629770,v1:127.0.0.1:6811/3600629770] [v2:127.0.0.1:6812/3600629770,v1:127.0.0.1:6813/3600629770] exists,up c524fb69-4bec-4568-a39e-8eb5619a5df9
2026-03-31T19:02:22.612 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0
2026-03-31T19:02:22.612 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break
2026-03-31T19:02:22.612 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0
2026-03-31T19:02:22.612 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:202: TEST_move_bucket: ceph osd crush weight-set create-compat
2026-03-31T19:02:22.926 INFO:tasks.workunit.client.0.vm05.stderr:compat weight-set already created
2026-03-31T19:02:22.936 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:203: TEST_move_bucket: ceph osd crush weight-set reweight-compat osd.0 2
2026-03-31T19:02:23.291 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:204: TEST_move_bucket: ceph osd crush weight-set reweight-compat osd.1 2
2026-03-31T19:02:23.600 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:205: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:23.860 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-31T19:02:23.860 INFO:tasks.workunit.client.0.vm05.stdout:-1 6.00000 root default
2026-03-31T19:02:23.860 INFO:tasks.workunit.client.0.vm05.stdout:-2 6.00000 4.00000 host HOST
2026-03-31T19:02:23.860 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0
2026-03-31T19:02:23.860 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 2.00000 osd.1
2026-03-31T19:02:23.870 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:206: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:23.870 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:206: TEST_move_bucket: grep HOST
2026-03-31T19:02:23.871 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:206: TEST_move_bucket: grep '6.00000 4.00000'
2026-03-31T19:02:24.131 INFO:tasks.workunit.client.0.vm05.stdout:-2 6.00000 4.00000 host HOST
2026-03-31T19:02:24.131 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:209: TEST_move_bucket: ceph osd crush add-bucket RACK rack root=default
2026-03-31T19:02:24.384 INFO:tasks.workunit.client.0.vm05.stderr:bucket 'RACK' already exists
2026-03-31T19:02:24.394 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:210: TEST_move_bucket: ceph osd crush move HOST rack=RACK
2026-03-31T19:02:24.690 INFO:tasks.workunit.client.0.vm05.stderr:no need to move item id -2 name 'HOST' to location {rack=RACK} in crush map
2026-03-31T19:02:24.700 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:211: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:24.948 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-31T19:02:24.948 INFO:tasks.workunit.client.0.vm05.stdout:-1 6.00000 root default
2026-03-31T19:02:24.948 INFO:tasks.workunit.client.0.vm05.stdout:-3 6.00000 4.00000 rack RACK
2026-03-31T19:02:24.948 INFO:tasks.workunit.client.0.vm05.stdout:-2 6.00000 4.00000 host HOST
2026-03-31T19:02:24.948 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0
2026-03-31T19:02:24.948 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 2.00000 osd.1
2026-03-31T19:02:24.958 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:212: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:24.958 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:212: TEST_move_bucket: grep HOST
2026-03-31T19:02:24.958 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:212: TEST_move_bucket: grep '6.00000 4.00000'
2026-03-31T19:02:25.212 INFO:tasks.workunit.client.0.vm05.stdout:-2 6.00000 4.00000 host HOST
2026-03-31T19:02:25.212 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:213: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:25.212 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:213: TEST_move_bucket: grep RACK
2026-03-31T19:02:25.212 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:213: TEST_move_bucket: grep '6.00000 4.00000'
2026-03-31T19:02:25.472 INFO:tasks.workunit.client.0.vm05.stdout:-3 6.00000 4.00000 rack RACK
2026-03-31T19:02:25.472 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:216: TEST_move_bucket: ceph osd crush weight-set reweight-compat osd.0 1
2026-03-31T19:02:25.826 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:217: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:26.063 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-31T19:02:26.063 INFO:tasks.workunit.client.0.vm05.stdout:-1 6.00000 root default
2026-03-31T19:02:26.063 INFO:tasks.workunit.client.0.vm05.stdout:-3 6.00000 3.00000 rack RACK
2026-03-31T19:02:26.063 INFO:tasks.workunit.client.0.vm05.stdout:-2 6.00000 3.00000 host HOST
2026-03-31T19:02:26.063 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 1.00000 osd.0
2026-03-31T19:02:26.063 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 2.00000 osd.1
2026-03-31T19:02:26.074 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:218: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:26.074 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:218: TEST_move_bucket: grep HOST
2026-03-31T19:02:26.074 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:218: TEST_move_bucket: grep '6.00000 3.00000'
2026-03-31T19:02:26.323 INFO:tasks.workunit.client.0.vm05.stdout:-2 6.00000 3.00000 host HOST
2026-03-31T19:02:26.323 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:219: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:26.323 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:219: TEST_move_bucket: grep RACK
2026-03-31T19:02:26.323 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:219: TEST_move_bucket: grep '6.00000 3.00000'
2026-03-31T19:02:26.575 INFO:tasks.workunit.client.0.vm05.stdout:-3 6.00000 3.00000 rack RACK
2026-03-31T19:02:26.576 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:222: TEST_move_bucket: ceph config set mon osd_crush_update_weight_set true
2026-03-31T19:02:26.827 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:223: TEST_move_bucket: ceph osd crush add-bucket FOO host root=default
2026-03-31T19:02:27.133 INFO:tasks.workunit.client.0.vm05.stderr:bucket 'FOO' already exists
2026-03-31T19:02:27.142 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:224: TEST_move_bucket: ceph osd crush move osd.0 host=FOO
2026-03-31T19:02:27.438 INFO:tasks.workunit.client.0.vm05.stderr:no need to move item id 0 name 'osd.0' to location {host=FOO} in crush map
2026-03-31T19:02:27.449 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:225: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:27.690 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-31T19:02:27.690 INFO:tasks.workunit.client.0.vm05.stdout:-1 6.00000 root default
2026-03-31T19:02:27.690 INFO:tasks.workunit.client.0.vm05.stdout:-4 3.00000 3.00000 host FOO
2026-03-31T19:02:27.690 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 3.00000 osd.0
2026-03-31T19:02:27.690 INFO:tasks.workunit.client.0.vm05.stdout:-3 3.00000 2.00000 rack RACK
2026-03-31T19:02:27.690 INFO:tasks.workunit.client.0.vm05.stdout:-2 3.00000 2.00000 host HOST
2026-03-31T19:02:27.690 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 2.00000 osd.1
2026-03-31T19:02:27.700 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:226: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:27.700 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:226: TEST_move_bucket: grep osd.0
2026-03-31T19:02:27.700 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:226: TEST_move_bucket: grep '3.00000 3.00000'
2026-03-31T19:02:27.948 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 3.00000 osd.0
2026-03-31T19:02:27.949 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:227: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:27.949 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:227: TEST_move_bucket: grep HOST
2026-03-31T19:02:27.949 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:227: TEST_move_bucket: grep '3.00000 2.00000'
2026-03-31T19:02:28.201 INFO:tasks.workunit.client.0.vm05.stdout:-2 3.00000 2.00000 host HOST
2026-03-31T19:02:28.201 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:228: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:28.201 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:228: TEST_move_bucket: grep RACK
2026-03-31T19:02:28.201 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:228: TEST_move_bucket: grep '3.00000 2.00000'
2026-03-31T19:02:28.453 INFO:tasks.workunit.client.0.vm05.stdout:-3 3.00000 2.00000 rack RACK
2026-03-31T19:02:28.453 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:231: TEST_move_bucket: ceph config set mon osd_crush_update_weight_set false
2026-03-31T19:02:28.699 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:232: TEST_move_bucket: ceph osd crush move osd.1 host=FOO
2026-03-31T19:02:28.985 INFO:tasks.workunit.client.0.vm05.stderr:no need to move item id 1 name 'osd.1' to location {host=FOO} in crush map
2026-03-31T19:02:28.995 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:233: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:29.231 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-31T19:02:29.231 INFO:tasks.workunit.client.0.vm05.stdout:-1 6.00000 root default
2026-03-31T19:02:29.231 INFO:tasks.workunit.client.0.vm05.stdout:-4 6.00000 3.00000 host FOO
2026-03-31T19:02:29.231 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 3.00000 osd.0
2026-03-31T19:02:29.231 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 0 osd.1
2026-03-31T19:02:29.231 INFO:tasks.workunit.client.0.vm05.stdout:-3 0 0 rack RACK
2026-03-31T19:02:29.231 INFO:tasks.workunit.client.0.vm05.stdout:-2 0 0 host HOST
2026-03-31T19:02:29.241 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:234: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:29.241 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:234: TEST_move_bucket: grep osd.0
2026-03-31T19:02:29.241 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:234: TEST_move_bucket: grep '3.00000 3.00000'
2026-03-31T19:02:29.490 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 3.00000 osd.0
2026-03-31T19:02:29.490 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:235: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:29.490 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:235: TEST_move_bucket: grep osd.1
2026-03-31T19:02:29.490 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:235: TEST_move_bucket: grep '3.00000 0'
2026-03-31T19:02:29.747 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 0 osd.1
2026-03-31T19:02:29.747 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:236: TEST_move_bucket: ceph osd crush tree
2026-03-31T19:02:29.747 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:236: TEST_move_bucket: grep FOO
2026-03-31T19:02:29.747 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:236: TEST_move_bucket: grep '6.00000 3.00000'
2026-03-31T19:02:29.997 INFO:tasks.workunit.client.0.vm05.stdout:-4 6.00000 3.00000 host FOO
2026-03-31T19:02:29.998 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:41: run: teardown td/crush-choose-args
2026-03-31T19:02:29.998 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-choose-args
2026-03-31T19:02:29.998 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs=
2026-03-31T19:02:29.998 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-choose-args KILL
2026-03-31T19:02:29.998 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace
2026-03-31T19:02:29.998 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true
2026-03-31T19:02:29.998 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true
2026-03-31T19:02:29.998 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true
2026-03-31T19:02:29.998 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace
2026-03-31T19:02:30.106 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0
2026-03-31T19:02:30.106 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname
2026-03-31T19:02:30.107 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-31T19:02:30.107 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T .
2026-03-31T19:02:30.108 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']'
2026-03-31T19:02:30.108 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no
2026-03-31T19:02:30.108 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern
2026-03-31T19:02:30.109 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:30.109 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']'
2026-03-31T19:02:30.110 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$'
2026-03-31T19:02:30.110 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:30.110 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-31T19:02:30.111 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']'
2026-03-31T19:02:30.111 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-choose-args
2026-03-31T19:02:30.118 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir
2026-03-31T19:02:30.118 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:30.119 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:30.119 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.51199
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']'
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:38: run: for func in $funcs
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:39: run: setup td/crush-choose-args
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:122: setup: local dir=td/crush-choose-args
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:123: setup: teardown td/crush-choose-args
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-choose-args
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs=
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-choose-args KILL
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true
2026-03-31T19:02:30.120 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace
2026-03-31T19:02:30.122 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0
2026-03-31T19:02:30.122 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname
2026-03-31T19:02:30.123 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-31T19:02:30.123 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T .
2026-03-31T19:02:30.124 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']'
2026-03-31T19:02:30.124 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no
2026-03-31T19:02:30.125 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern
2026-03-31T19:02:30.125 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:30.125 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']'
2026-03-31T19:02:30.126 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$'
2026-03-31T19:02:30.126 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:30.127 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-31T19:02:30.127 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']'
2026-03-31T19:02:30.127 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-choose-args
2026-03-31T19:02:30.128 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir
2026-03-31T19:02:30.128 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:30.128 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:30.129 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.51199
2026-03-31T19:02:30.129 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']'
2026-03-31T19:02:30.129 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0
2026-03-31T19:02:30.129 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:124: setup: mkdir -p td/crush-choose-args
2026-03-31T19:02:30.131 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: get_asok_dir
2026-03-31T19:02:30.131 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:30.131 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:30.131 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: mkdir -p /tmp/ceph-asok.51199
2026-03-31T19:02:30.132 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: ulimit -n
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: '[' 4096 -le 1024 ']'
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:129: setup: '[' -z '' ']'
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:130: setup: trap 'teardown td/crush-choose-args 1' TERM HUP INT
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:40: run: TEST_no_update_weight_set td/crush-choose-args
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:104: TEST_no_update_weight_set: local dir=td/crush-choose-args
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:106: TEST_no_update_weight_set: ORIG_CEPH_ARGS='--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false '
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:107: TEST_no_update_weight_set: CEPH_ARGS+='--osd-crush-update-weight-set=false '
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:109: TEST_no_update_weight_set: run_mon td/crush-choose-args a
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:439: run_mon: local dir=td/crush-choose-args
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:440: run_mon: shift
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:441: run_mon: local id=a
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:442: run_mon: shift
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:443: run_mon: local data=td/crush-choose-args/a
2026-03-31T19:02:30.133 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:446: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-choose-args/a --run-dir=td/crush-choose-args
2026-03-31T19:02:30.160 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: get_asok_path
2026-03-31T19:02:30.160 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=
2026-03-31T19:02:30.160 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']'
2026-03-31T19:02:30.160 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir
2026-03-31T19:02:30.160 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:30.160 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:30.160 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok'
2026-03-31T19:02:30.161 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-choose-args/a '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --mon-cluster-log-file=td/crush-choose-args/log --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-31T19:02:30.194 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: cat 2026-03-31T19:02:30.194 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a fsid 2026-03-31T19:02:30.194 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon 2026-03-31T19:02:30.194 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a 2026-03-31T19:02:30.194 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=fsid 2026-03-31T19:02:30.195 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a 2026-03-31T19:02:30.195 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a 2026-03-31T19:02:30.195 
INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']' 2026-03-31T19:02:30.195 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir 2026-03-31T19:02:30.195 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:30.195 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:30.195 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.51199/ceph-mon.a.asok 2026-03-31T19:02:30.195 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.51199/ceph-mon.a.asok 2026-03-31T19:02:30.196 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS= 2026-03-31T19:02:30.196 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.51199/ceph-mon.a.asok config get fsid 2026-03-31T19:02:30.196 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .fsid 2026-03-31T19:02:30.247 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a mon_host 2026-03-31T19:02:30.247 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon 2026-03-31T19:02:30.247 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a 2026-03-31T19:02:30.247 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=mon_host 2026-03-31T19:02:30.247 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a 2026-03-31T19:02:30.247 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a 2026-03-31T19:02:30.247 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']' 2026-03-31T19:02:30.247 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir 2026-03-31T19:02:30.248 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:30.248 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:30.248 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.51199/ceph-mon.a.asok 2026-03-31T19:02:30.248 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.51199/ceph-mon.a.asok 2026-03-31T19:02:30.248 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS= 2026-03-31T19:02:30.248 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.51199/ceph-mon.a.asok config get mon_host 2026-03-31T19:02:30.248 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .mon_host 2026-03-31T19:02:30.293 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:110: TEST_no_update_weight_set: run_mgr td/crush-choose-args x 2026-03-31T19:02:30.293 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:545: run_mgr: local dir=td/crush-choose-args 2026-03-31T19:02:30.293 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: run_mgr: shift 2026-03-31T19:02:30.293 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: run_mgr: local id=x 2026-03-31T19:02:30.294 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:548: run_mgr: shift 2026-03-31T19:02:30.294 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:549: run_mgr: local data=td/crush-choose-args/x 2026-03-31T19:02:30.294 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:551: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-31T19:02:30.407 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: get_asok_path 2026-03-31T19:02:30.408 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:30.408 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:30.408 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:30.408 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:30.408 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:30.408 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:30.408 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-31T19:02:30.409 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/crush-choose-args/x '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:111: TEST_no_update_weight_set: run_osd td/crush-choose-args 0 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-choose-args 
2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=0 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-choose-args/0 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false ' 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/0' 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' 
--osd-journal=td/crush-choose-args/0/journal' 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:30.430 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' 
2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:02:30.431 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+= 2026-03-31T19:02:30.431 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-choose-args/0 2026-03-31T19:02:30.432 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:02:30.433 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=6a9c7dc9-356b-46e4-a8b6-5bfbd09a5da6 2026-03-31T19:02:30.433 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd0 6a9c7dc9-356b-46e4-a8b6-5bfbd09a5da6' 2026-03-31T19:02:30.433 INFO:tasks.workunit.client.0.vm05.stdout:add osd0 6a9c7dc9-356b-46e4-a8b6-5bfbd09a5da6 2026-03-31T19:02:30.433 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:02:30.446 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQBGGsxp8fmNGhAA9xkxcgv1VhGvbBGubFKe0g== 2026-03-31T19:02:30.446 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQBGGsxp8fmNGhAA9xkxcgv1VhGvbBGubFKe0g=="}' 2026-03-31T19:02:30.446 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new 6a9c7dc9-356b-46e4-a8b6-5bfbd09a5da6 -i td/crush-choose-args/0/new.json 2026-03-31T19:02:30.562 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:30.569 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: 
run_osd: rm td/crush-choose-args/0/new.json 2026-03-31T19:02:30.570 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 0 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBGGsxp8fmNGhAA9xkxcgv1VhGvbBGubFKe0g== --osd-uuid 6a9c7dc9-356b-46e4-a8b6-5bfbd09a5da6 2026-03-31T19:02:30.592 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:30.590+0000 7f8405610900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:30.595 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:30.594+0000 7f8405610900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:30.597 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:30.595+0000 7f8405610900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:30.597 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:30.595+0000 7f8405610900 -1 bdev(0x555f4b220c00 td/crush-choose-args/0/block) open stat got: (1) Operation not permitted 2026-03-31T19:02:30.597 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:30.595+0000 7f8405610900 -1 bluestore(td/crush-choose-args/0) _read_fsid unparsable uuid 2026-03-31T19:02:31.072 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-choose-args/0/keyring 2026-03-31T19:02:31.072 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:02:31.073 INFO:tasks.workunit.client.0.vm05.stdout:adding osd0 key to auth repository 2026-03-31T19:02:31.073 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd0 key to auth repository 2026-03-31T19:02:31.073 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-choose-args/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:02:31.185 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.0 2026-03-31T19:02:31.185 INFO:tasks.workunit.client.0.vm05.stdout:start osd.0 2026-03-31T19:02:31.185 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 0 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:02:31.185 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:02:31.185 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:02:31.186 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:02:31.205 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:31.204+0000 7fa6641dc900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:31.206 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:31.205+0000 7fa6641dc900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:31.208 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:31.206+0000 7fa6641dc900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:31.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 0 2026-03-31T19:02:31.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:02:31.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=0 2026-03-31T19:02:31.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:02:31.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:02:31.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:31.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:02:31.295 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:31.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:31.295 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:02:31.337 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:31.335+0000 7fa6641dc900 -1 Falling back to public interface 2026-03-31T19:02:31.411 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:02:31.474 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:31.472+0000 7fa6641dc900 -1 osd.0 0 log_to_monitors true 2026-03-31T19:02:32.412 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:02:32.412 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:02:32.413 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:32.413 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:02:32.413 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:32.413 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:02:32.522 INFO:tasks.workunit.client.0.vm05.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/295416427,v1:127.0.0.1:6801/295416427] [v2:127.0.0.1:6802/295416427,v1:127.0.0.1:6803/295416427] exists,up 6a9c7dc9-356b-46e4-a8b6-5bfbd09a5da6 2026-03-31T19:02:32.522 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:02:32.522 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:02:32.522 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:02:32.522 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:113: TEST_no_update_weight_set: ceph osd set-require-min-compat-client luminous 2026-03-31T19:02:32.803 INFO:tasks.workunit.client.0.vm05.stderr:set require_min_compat_client to luminous 2026-03-31T19:02:32.813 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:114: TEST_no_update_weight_set: ceph osd crush tree 2026-03-31T19:02:33.047 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT TYPE NAME 2026-03-31T19:02:33.047 INFO:tasks.workunit.client.0.vm05.stdout:-1 3.00000 root default 2026-03-31T19:02:33.047 INFO:tasks.workunit.client.0.vm05.stdout:-2 3.00000 host HOST 2026-03-31T19:02:33.047 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 osd.0 2026-03-31T19:02:33.056 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:115: TEST_no_update_weight_set: ceph osd getcrushmap 2026-03-31T19:02:33.283 INFO:tasks.workunit.client.0.vm05.stderr:2 2026-03-31T19:02:33.290 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:116: TEST_no_update_weight_set: crushtool -d td/crush-choose-args/map -o td/crush-choose-args/map.txt 2026-03-31T19:02:33.303 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:117: TEST_no_update_weight_set: sed -i -e '/end crush map/d' td/crush-choose-args/map.txt 2026-03-31T19:02:33.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:118: TEST_no_update_weight_set: cat 2026-03-31T19:02:33.305 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:141: TEST_no_update_weight_set: crushtool -c td/crush-choose-args/map.txt -o td/crush-choose-args/map-new 2026-03-31T19:02:33.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:142: TEST_no_update_weight_set: ceph osd setcrushmap -i td/crush-choose-args/map-new 2026-03-31T19:02:33.679 INFO:tasks.workunit.client.0.vm05.stderr:4 
2026-03-31T19:02:33.688 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:143: TEST_no_update_weight_set: ceph osd crush tree 2026-03-31T19:02:33.922 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT 0 TYPE NAME 2026-03-31T19:02:33.922 INFO:tasks.workunit.client.0.vm05.stdout:-1 3.00000 root default 2026-03-31T19:02:33.922 INFO:tasks.workunit.client.0.vm05.stdout:-2 3.00000 2.00000 host HOST 2026-03-31T19:02:33.922 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:146: TEST_no_update_weight_set: run_osd td/crush-choose-args 1 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-choose-args 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=1 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-choose-args/1 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false ' 
2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/1' 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/1/journal' 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:33.931 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:33.931 
INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' 
--osd-max-object-name-len=460' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+= 2026-03-31T19:02:33.932 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-choose-args/1 2026-03-31T19:02:33.933 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:02:33.934 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=d0247a6d-3e20-49c5-9220-78705320390c 2026-03-31T19:02:33.934 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd1 d0247a6d-3e20-49c5-9220-78705320390c' 2026-03-31T19:02:33.934 INFO:tasks.workunit.client.0.vm05.stdout:add osd1 d0247a6d-3e20-49c5-9220-78705320390c 2026-03-31T19:02:33.934 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 
2026-03-31T19:02:33.948 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQBJGsxp9mB0OBAAv7qMNcwNkrOgvkVg4f5V5w== 2026-03-31T19:02:33.948 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQBJGsxp9mB0OBAAv7qMNcwNkrOgvkVg4f5V5w=="}' 2026-03-31T19:02:33.948 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new d0247a6d-3e20-49c5-9220-78705320390c -i td/crush-choose-args/1/new.json 2026-03-31T19:02:34.191 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:02:34.200 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-choose-args/1/new.json 2026-03-31T19:02:34.201 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 1 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBJGsxp9mB0OBAAv7qMNcwNkrOgvkVg4f5V5w== --osd-uuid d0247a6d-3e20-49c5-9220-78705320390c 2026-03-31T19:02:34.221 
INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:34.220+0000 7f3a7a9cf900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:34.223 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:34.222+0000 7f3a7a9cf900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:34.224 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:34.222+0000 7f3a7a9cf900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:34.224 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:34.223+0000 7f3a7a9cf900 -1 bdev(0x55f2f7a0ec00 td/crush-choose-args/1/block) open stat got: (1) Operation not permitted 2026-03-31T19:02:34.224 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:34.223+0000 7f3a7a9cf900 -1 bluestore(td/crush-choose-args/1) _read_fsid unparsable uuid 2026-03-31T19:02:34.634 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-choose-args/1/keyring 2026-03-31T19:02:34.634 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:02:34.635 INFO:tasks.workunit.client.0.vm05.stdout:adding osd1 key to auth repository 2026-03-31T19:02:34.635 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd1 key to auth repository 2026-03-31T19:02:34.635 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-choose-args/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:02:34.976 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.1 2026-03-31T19:02:34.976 INFO:tasks.workunit.client.0.vm05.stdout:start osd.1 
2026-03-31T19:02:34.976 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 1 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:02:34.976 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:02:34.976 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:02:34.976 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:02:34.997 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:34.996+0000 7f920d791900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:34.999 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:34.998+0000 7f920d791900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:35.000 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:34.999+0000 7f920d791900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:35.123 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:35.121+0000 7f920d791900 -1 Falling back to public interface 2026-03-31T19:02:35.227 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:35.227 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 1 2026-03-31T19:02:35.227 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:02:35.227 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=1 2026-03-31T19:02:35.228 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:02:35.228 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:02:35.228 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:35.228 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:02:35.228 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:35.228 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up' 2026-03-31T19:02:35.300 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:35.298+0000 7f920d791900 -1 osd.1 0 log_to_monitors true 2026-03-31T19:02:35.475 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:02:36.476 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:02:36.477 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:02:36.477 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:36.477 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:02:36.477 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:36.477 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up' 2026-03-31T19:02:36.716 INFO:tasks.workunit.client.0.vm05.stdout:osd.1 up in weight 1 up_from 11 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/901446773,v1:127.0.0.1:6811/901446773] [v2:127.0.0.1:6812/901446773,v1:127.0.0.1:6813/901446773] exists,up d0247a6d-3e20-49c5-9220-78705320390c 2026-03-31T19:02:36.717 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:02:36.717 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:02:36.717 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:02:36.717 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:147: TEST_no_update_weight_set: ceph osd crush tree 2026-03-31T19:02:36.959 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT 0 TYPE NAME 2026-03-31T19:02:36.960 INFO:tasks.workunit.client.0.vm05.stdout:-1 
6.00000 root default 2026-03-31T19:02:36.960 INFO:tasks.workunit.client.0.vm05.stdout:-2 6.00000 2.00000 host HOST 2026-03-31T19:02:36.960 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0 2026-03-31T19:02:36.960 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 0 osd.1 2026-03-31T19:02:36.969 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:148: TEST_no_update_weight_set: ceph osd getcrushmap 2026-03-31T19:02:37.207 INFO:tasks.workunit.client.0.vm05.stderr:5 2026-03-31T19:02:37.216 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:149: TEST_no_update_weight_set: crushtool -d td/crush-choose-args/map-one-more -o td/crush-choose-args/map-one-more.txt 2026-03-31T19:02:37.230 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:150: TEST_no_update_weight_set: cat td/crush-choose-args/map-one-more.txt 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:# begin crush map 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:tunable choose_local_tries 0 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:tunable choose_local_fallback_tries 0 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:tunable choose_total_tries 50 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:tunable chooseleaf_descend_once 1 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:tunable chooseleaf_vary_r 1 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:tunable chooseleaf_stable 1 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:tunable straw_calc_version 1 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:tunable allowed_bucket_algs 54 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout: 2026-03-31T19:02:37.231 
INFO:tasks.workunit.client.0.vm05.stdout:# devices 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:device 0 osd.0 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:device 1 osd.1 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout: 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:# types 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 0 osd 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 1 host 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 2 chassis 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 3 rack 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 4 row 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 5 pdu 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 6 pod 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 7 room 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 8 datacenter 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 9 zone 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 10 region 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout:type 11 root 2026-03-31T19:02:37.231 INFO:tasks.workunit.client.0.vm05.stdout: 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:# buckets 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:host HOST { 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: id -2 # do not change unnecessarily 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: # weight 6.00000 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: alg straw2 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: hash 0 # rjenkins1 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: item osd.0 weight 3.00000 2026-03-31T19:02:37.232 
INFO:tasks.workunit.client.0.vm05.stdout: item osd.1 weight 3.00000 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:} 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:root default { 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: id -1 # do not change unnecessarily 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: # weight 6.00000 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: alg straw2 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: hash 0 # rjenkins1 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: item HOST weight 6.00000 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:} 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:# rules 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:rule replicated_rule { 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: id 0 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: type replicated 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: step take default 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: step choose firstn 0 type osd 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: step emit 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:} 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:# choose_args 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:choose_args 0 { 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: { 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: bucket_id -1 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: weight_set [ 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: [ 2.00000 ] 2026-03-31T19:02:37.232 
INFO:tasks.workunit.client.0.vm05.stdout: [ 1.00000 ] 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: ] 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: ids [ -10 ] 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: } 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: { 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: bucket_id -2 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: weight_set [ 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: [ 2.00000 0.00000 ] 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: [ 1.00000 0.00000 ] 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: ] 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: ids [ -20 1 ] 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: } 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:} 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout: 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stdout:# end crush map 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:151: TEST_no_update_weight_set: diff -u td/crush-choose-args/map-one-more.txt /home/ubuntu/cephtest/clone.client.0/src/test/crush/crush-choose-args-expected-one-more-0.txt 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:153: TEST_no_update_weight_set: destroy_osd td/crush-choose-args 1 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:776: destroy_osd: local dir=td/crush-choose-args 2026-03-31T19:02:37.232 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:777: destroy_osd: local id=1 2026-03-31T19:02:37.232 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:779: destroy_osd: ceph osd out osd.1 2026-03-31T19:02:37.531 INFO:tasks.workunit.client.0.vm05.stderr:osd.1 is already out. 2026-03-31T19:02:37.541 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:780: destroy_osd: kill_daemons td/crush-choose-args TERM osd.1 2026-03-31T19:02:37.541 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:02:37.541 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:02:37.541 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:02:37.541 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:02:37.541 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:02:37.646 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:02:37.646 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:781: destroy_osd: ceph osd down osd.1 2026-03-31T19:02:37.892 INFO:tasks.workunit.client.0.vm05.stderr:osd.1 is already down. 
2026-03-31T19:02:37.902 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:782: destroy_osd: ceph osd purge osd.1 --yes-i-really-mean-it 2026-03-31T19:02:38.148 INFO:tasks.workunit.client.0.vm05.stderr:osd.1 does not exist 2026-03-31T19:02:38.158 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:783: destroy_osd: teardown td/crush-choose-args/1 2026-03-31T19:02:38.158 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-choose-args/1 2026-03-31T19:02:38.158 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs= 2026-03-31T19:02:38.158 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-choose-args/1 KILL 2026-03-31T19:02:38.159 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:02:38.159 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:02:38.159 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:02:38.159 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:02:38.159 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:02:38.160 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 
2026-03-31T19:02:38.161 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:02:38.161 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:02:38.162 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 2026-03-31T19:02:38.162 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:02:38.162 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:02:38.163 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:02:38.163 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:38.164 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:02:38.164 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:02:38.164 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:38.165 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:02:38.166 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']' 2026-03-31T19:02:38.166 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-choose-args/1 2026-03-31T19:02:38.168 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:02:38.168 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:38.168 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:38.168 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.51199 2026-03-31T19:02:38.170 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:02:38.170 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:02:38.170 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:784: destroy_osd: rm -fr td/crush-choose-args/1 2026-03-31T19:02:38.171 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:154: TEST_no_update_weight_set: ceph osd crush tree 2026-03-31T19:02:38.409 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT 0 TYPE NAME 2026-03-31T19:02:38.409 INFO:tasks.workunit.client.0.vm05.stdout:-1 3.00000 root default 2026-03-31T19:02:38.409 INFO:tasks.workunit.client.0.vm05.stdout:-2 3.00000 2.00000 host HOST 
2026-03-31T19:02:38.409 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0 2026-03-31T19:02:38.419 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:155: TEST_no_update_weight_set: ceph osd getcrushmap 2026-03-31T19:02:38.771 INFO:tasks.workunit.client.0.vm05.stderr:6 2026-03-31T19:02:38.781 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:156: TEST_no_update_weight_set: crushtool -d td/crush-choose-args/map-one-less -o td/crush-choose-args/map-one-less.txt 2026-03-31T19:02:38.796 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:157: TEST_no_update_weight_set: diff -u td/crush-choose-args/map-one-less.txt td/crush-choose-args/map.txt 2026-03-31T19:02:38.797 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:159: TEST_no_update_weight_set: CEPH_ARGS='--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false ' 2026-03-31T19:02:38.797 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:41: run: teardown td/crush-choose-args 2026-03-31T19:02:38.797 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-choose-args 2026-03-31T19:02:38.797 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs= 2026-03-31T19:02:38.797 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-choose-args KILL 2026-03-31T19:02:38.797 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:02:38.797 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:02:38.797 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:02:38.797 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:02:38.797 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:02:38.906 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:02:38.906 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:02:38.907 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:02:38.907 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 
2026-03-31T19:02:38.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:02:38.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:02:38.908 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:02:38.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:38.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:02:38.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:02:38.909 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:38.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:02:38.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']' 2026-03-31T19:02:38.911 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-choose-args 2026-03-31T19:02:38.915 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:02:38.915 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:38.916 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:38.916 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.51199 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:38: run: for func in $funcs 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:39: run: setup td/crush-choose-args 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:122: setup: local dir=td/crush-choose-args 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:123: setup: teardown td/crush-choose-args 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-choose-args 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs= 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: 
teardown: kill_daemons td/crush-choose-args KILL 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:02:38.917 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:02:38.919 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:02:38.919 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:02:38.920 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:02:38.920 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 
2026-03-31T19:02:38.921 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:02:38.921 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:02:38.921 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:02:38.922 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:38.922 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:02:38.922 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:02:38.922 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:38.923 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:02:38.924 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']' 2026-03-31T19:02:38.924 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-choose-args 2026-03-31T19:02:38.925 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:02:38.925 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:38.925 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:38.925 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.51199 2026-03-31T19:02:38.926 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:02:38.926 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:02:38.926 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:124: setup: mkdir -p td/crush-choose-args 2026-03-31T19:02:38.927 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: get_asok_dir 2026-03-31T19:02:38.927 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:38.927 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:38.927 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: mkdir -p /tmp/ceph-asok.51199 2026-03-31T19:02:38.928 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: ulimit -n 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: '[' 4096 -le 1024 ']' 
2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:129: setup: '[' -z '' ']' 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:130: setup: trap 'teardown td/crush-choose-args 1' TERM HUP INT 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:40: run: TEST_reweight td/crush-choose-args 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:164: TEST_reweight: local dir=td/crush-choose-args 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:166: TEST_reweight: ORIG_CEPH_ARGS='--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false ' 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:167: TEST_reweight: CEPH_ARGS+='--osd-crush-update-weight-set=false ' 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:169: TEST_reweight: run_mon td/crush-choose-args a 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:439: run_mon: local dir=td/crush-choose-args 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:440: run_mon: shift 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:441: run_mon: local 
id=a 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:442: run_mon: shift 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:443: run_mon: local data=td/crush-choose-args/a 2026-03-31T19:02:38.929 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:446: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-choose-args/a --run-dir=td/crush-choose-args 2026-03-31T19:02:39.069 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: get_asok_path 2026-03-31T19:02:39.069 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:39.069 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:39.069 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:39.069 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:39.069 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:39.070 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:39.070 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 
--mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-choose-args/a '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --mon-cluster-log-file=td/crush-choose-args/log --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-31T19:02:39.103 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: cat 2026-03-31T19:02:39.104 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a fsid 2026-03-31T19:02:39.104 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon 2026-03-31T19:02:39.104 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a 2026-03-31T19:02:39.104 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=fsid 2026-03-31T19:02:39.104 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a 2026-03-31T19:02:39.104 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a 2026-03-31T19:02:39.104 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']' 2026-03-31T19:02:39.105 
INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir 2026-03-31T19:02:39.105 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:39.105 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:39.105 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.51199/ceph-mon.a.asok 2026-03-31T19:02:39.105 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.51199/ceph-mon.a.asok 2026-03-31T19:02:39.105 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS= 2026-03-31T19:02:39.105 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.51199/ceph-mon.a.asok config get fsid 2026-03-31T19:02:39.105 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .fsid 2026-03-31T19:02:39.153 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a mon_host 2026-03-31T19:02:39.153 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon 2026-03-31T19:02:39.153 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a 2026-03-31T19:02:39.153 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=mon_host 2026-03-31T19:02:39.153 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a 2026-03-31T19:02:39.153 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a 2026-03-31T19:02:39.153 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']' 2026-03-31T19:02:39.154 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir 2026-03-31T19:02:39.154 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:39.154 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:39.154 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.51199/ceph-mon.a.asok 2026-03-31T19:02:39.154 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.51199/ceph-mon.a.asok 2026-03-31T19:02:39.154 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS= 2026-03-31T19:02:39.154 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.51199/ceph-mon.a.asok config get mon_host 2026-03-31T19:02:39.154 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .mon_host 2026-03-31T19:02:39.202 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:170: TEST_reweight: run_mgr td/crush-choose-args x 2026-03-31T19:02:39.202 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:545: run_mgr: local dir=td/crush-choose-args 2026-03-31T19:02:39.202 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:546: run_mgr: shift 2026-03-31T19:02:39.202 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:547: run_mgr: local id=x 2026-03-31T19:02:39.202 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:548: run_mgr: shift 2026-03-31T19:02:39.202 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:549: run_mgr: local data=td/crush-choose-args/x 2026-03-31T19:02:39.202 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:551: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-31T19:02:39.318 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: get_asok_path 2026-03-31T19:02:39.318 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:39.318 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:39.318 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: 
get_asok_dir 2026-03-31T19:02:39.318 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:39.318 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:39.318 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:39.319 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-31T19:02:39.319 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/crush-choose-args/x '--log-file=td/crush-choose-args/$name.log' '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --run-dir=td/crush-choose-args '--pid-file=td/crush-choose-args/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-31T19:02:39.339 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:171: TEST_reweight: run_osd td/crush-choose-args 0 2026-03-31T19:02:39.339 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-choose-args 2026-03-31T19:02:39.339 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:02:39.339 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=0 
2026-03-31T19:02:39.339 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-choose-args/0 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false ' 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/0' 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/0/journal' 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:02:39.340 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-choose-args' 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:39.340 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' 2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:02:39.341 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1'
2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20'
2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log'
2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid'
2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' '
2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=
2026-03-31T19:02:39.341 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-choose-args/0
2026-03-31T19:02:39.343 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen
2026-03-31T19:02:39.345 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=a79d61aa-5431-4322-b1b6-323eed0ffac1
2026-03-31T19:02:39.345 INFO:tasks.workunit.client.0.vm05.stdout:add osd0 a79d61aa-5431-4322-b1b6-323eed0ffac1
2026-03-31T19:02:39.345 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd0 a79d61aa-5431-4322-b1b6-323eed0ffac1'
2026-03-31T19:02:39.345 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key
2026-03-31T19:02:39.359 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQBPGsxpqzFaFRAA6OhEJa5AEQoft9GKzljbeg==
2026-03-31T19:02:39.359 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQBPGsxpqzFaFRAA6OhEJa5AEQoft9GKzljbeg=="}'
2026-03-31T19:02:39.359 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new a79d61aa-5431-4322-b1b6-323eed0ffac1 -i td/crush-choose-args/0/new.json
2026-03-31T19:02:39.477 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-31T19:02:39.484 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-choose-args/0/new.json
2026-03-31T19:02:39.485 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 0 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBPGsxpqzFaFRAA6OhEJa5AEQoft9GKzljbeg== --osd-uuid a79d61aa-5431-4322-b1b6-323eed0ffac1
2026-03-31T19:02:39.507 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:39.506+0000 7f8e8ca0e900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:39.513 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:39.510+0000 7f8e8ca0e900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:39.514 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:39.513+0000 7f8e8ca0e900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:39.514 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:39.513+0000 7f8e8ca0e900 -1 bdev(0x5583d269cc00 td/crush-choose-args/0/block) open stat got: (1) Operation not permitted
2026-03-31T19:02:39.514 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:39.513+0000 7f8e8ca0e900 -1 bluestore(td/crush-choose-args/0) _read_fsid unparsable uuid
2026-03-31T19:02:39.948 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-choose-args/0/keyring
2026-03-31T19:02:39.948 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat
2026-03-31T19:02:39.949 INFO:tasks.workunit.client.0.vm05.stdout:adding osd0 key to auth repository
2026-03-31T19:02:39.949 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd0 key to auth repository
2026-03-31T19:02:39.949 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-choose-args/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-31T19:02:40.064 INFO:tasks.workunit.client.0.vm05.stdout:start osd.0
2026-03-31T19:02:40.064 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.0
2026-03-31T19:02:40.064 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 0 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/0 --osd-journal=td/crush-choose-args/0/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-31T19:02:40.065 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json
2026-03-31T19:02:40.065 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"'
2026-03-31T19:02:40.066 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]'
2026-03-31T19:02:40.084 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:40.083+0000 7f6aabf18900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:40.086 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:40.085+0000 7f6aabf18900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:40.087 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:40.086+0000 7f6aabf18900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:40.176 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 0
2026-03-31T19:02:40.176 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up
2026-03-31T19:02:40.176 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=0
2026-03-31T19:02:40.176 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1
2026-03-31T19:02:40.176 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 ))
2026-03-31T19:02:40.176 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 ))
2026-03-31T19:02:40.176 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0
2026-03-31T19:02:40.176 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-31T19:02:40.176 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump
2026-03-31T19:02:40.176 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up'
2026-03-31T19:02:40.190 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:40.189+0000 7f6aabf18900 -1 Falling back to public interface
2026-03-31T19:02:40.285 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1
2026-03-31T19:02:40.305 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:40.304+0000 7f6aabf18900 -1 osd.0 0 log_to_monitors true
2026-03-31T19:02:41.286 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-31T19:02:41.286 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ ))
2026-03-31T19:02:41.286 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 ))
2026-03-31T19:02:41.286 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1
2026-03-31T19:02:41.286 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump
2026-03-31T19:02:41.286 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up'
2026-03-31T19:02:41.399 INFO:tasks.workunit.client.0.vm05.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/4052857496,v1:127.0.0.1:6801/4052857496] [v2:127.0.0.1:6802/4052857496,v1:127.0.0.1:6803/4052857496] exists,up a79d61aa-5431-4322-b1b6-323eed0ffac1
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:172: TEST_reweight: run_osd td/crush-choose-args 1
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-choose-args
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=1
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-choose-args/1
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false '
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/1'
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/1/journal'
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir='
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+=
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-choose-args'
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']'
2026-03-31T19:02:41.400 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' '
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-choose-args/1
2026-03-31T19:02:41.401 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen
2026-03-31T19:02:41.402 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=c321bd8e-825a-4feb-b367-2613cd6f5289
2026-03-31T19:02:41.402 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd1 c321bd8e-825a-4feb-b367-2613cd6f5289'
2026-03-31T19:02:41.402 INFO:tasks.workunit.client.0.vm05.stdout:add osd1 c321bd8e-825a-4feb-b367-2613cd6f5289
2026-03-31T19:02:41.402 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key
2026-03-31T19:02:41.416 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQBRGsxp193AGBAAwSxACkkVKvG5jpU9OJ85ig==
2026-03-31T19:02:41.416 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQBRGsxp193AGBAAwSxACkkVKvG5jpU9OJ85ig=="}'
2026-03-31T19:02:41.416 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new c321bd8e-825a-4feb-b367-2613cd6f5289 -i td/crush-choose-args/1/new.json
2026-03-31T19:02:41.654 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-31T19:02:41.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-choose-args/1/new.json
2026-03-31T19:02:41.663 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 1 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBRGsxp193AGBAAwSxACkkVKvG5jpU9OJ85ig== --osd-uuid c321bd8e-825a-4feb-b367-2613cd6f5289
2026-03-31T19:02:41.683 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:41.681+0000 7f3e661cb900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:41.684 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:41.683+0000 7f3e661cb900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:41.685 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:41.684+0000 7f3e661cb900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:41.685 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:41.684+0000 7f3e661cb900 -1 bdev(0x561f26778c00 td/crush-choose-args/1/block) open stat got: (1) Operation not permitted
2026-03-31T19:02:41.685 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:41.684+0000 7f3e661cb900 -1 bluestore(td/crush-choose-args/1) _read_fsid unparsable uuid
2026-03-31T19:02:42.136 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-choose-args/1/keyring
2026-03-31T19:02:42.136 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat
2026-03-31T19:02:42.137 INFO:tasks.workunit.client.0.vm05.stdout:adding osd1 key to auth repository
2026-03-31T19:02:42.137 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd1 key to auth repository
2026-03-31T19:02:42.137 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-choose-args/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-31T19:02:42.457 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.1
2026-03-31T19:02:42.457 INFO:tasks.workunit.client.0.vm05.stdout:start osd.1
2026-03-31T19:02:42.457 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 1 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/1 --osd-journal=td/crush-choose-args/1/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-31T19:02:42.457 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json
2026-03-31T19:02:42.457 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]'
2026-03-31T19:02:42.457 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"'
2026-03-31T19:02:42.476 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:42.475+0000 7fbe0faf1900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:42.478 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:42.476+0000 7fbe0faf1900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:42.479 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:42.477+0000 7fbe0faf1900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:42.612 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:42.610+0000 7fbe0faf1900 -1 Falling back to public interface
2026-03-31T19:02:42.696 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 1
2026-03-31T19:02:42.696 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up
2026-03-31T19:02:42.696 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=1
2026-03-31T19:02:42.696 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1
2026-03-31T19:02:42.696 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 ))
2026-03-31T19:02:42.696 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 ))
2026-03-31T19:02:42.696 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0
2026-03-31T19:02:42.696 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-31T19:02:42.696 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump
2026-03-31T19:02:42.696 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up'
2026-03-31T19:02:42.753 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:42.751+0000 7fbe0faf1900 -1 osd.1 0 log_to_monitors true
2026-03-31T19:02:42.944 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1
2026-03-31T19:02:43.945 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-31T19:02:43.945 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ ))
2026-03-31T19:02:43.945 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 ))
2026-03-31T19:02:43.945 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1
2026-03-31T19:02:43.945 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump
2026-03-31T19:02:43.945 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up'
2026-03-31T19:02:44.182 INFO:tasks.workunit.client.0.vm05.stdout:osd.1 up in weight 1 up_from 7 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1494581805,v1:127.0.0.1:6811/1494581805] [v2:127.0.0.1:6812/1494581805,v1:127.0.0.1:6813/1494581805] exists,up c321bd8e-825a-4feb-b367-2613cd6f5289
2026-03-31T19:02:44.182 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0
2026-03-31T19:02:44.182 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break
2026-03-31T19:02:44.182 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0
2026-03-31T19:02:44.182 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:174: TEST_reweight: ceph osd crush weight-set create-compat
2026-03-31T19:02:44.436 INFO:tasks.workunit.client.0.vm05.stderr:compat weight-set already created
2026-03-31T19:02:44.446 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:175: TEST_reweight: ceph osd crush tree
2026-03-31T19:02:44.681 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-31T19:02:44.681 INFO:tasks.workunit.client.0.vm05.stdout:-1 6.00000 root default
2026-03-31T19:02:44.681 INFO:tasks.workunit.client.0.vm05.stdout:-2 6.00000 6.00000 host HOST
2026-03-31T19:02:44.681 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 3.00000 osd.0
2026-03-31T19:02:44.681 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 3.00000 osd.1
2026-03-31T19:02:44.690 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:177: TEST_reweight: ceph osd crush weight-set reweight-compat osd.0 2
2026-03-31T19:02:44.981 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:178: TEST_reweight: ceph osd crush tree
2026-03-31T19:02:45.217 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT (compat) TYPE NAME
2026-03-31T19:02:45.217 INFO:tasks.workunit.client.0.vm05.stdout:-1 6.00000 root default
2026-03-31T19:02:45.217 INFO:tasks.workunit.client.0.vm05.stdout:-2 6.00000 5.00000 host HOST
2026-03-31T19:02:45.217 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0
2026-03-31T19:02:45.217 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 3.00000 osd.1
2026-03-31T19:02:45.225 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:179: TEST_reweight: ceph osd crush tree
2026-03-31T19:02:45.225 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:179: TEST_reweight: grep host
2026-03-31T19:02:45.226 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:179: TEST_reweight: grep '6.00000 5.00000'
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stdout:-2 6.00000 5.00000 host HOST
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:181: TEST_reweight: run_osd td/crush-choose-args 2
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-choose-args
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=2
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-choose-args/2
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false '
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-choose-args/2'
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-choose-args/2/journal'
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir='
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+=
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-choose-args'
2026-03-31T19:02:45.470 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.51199/$cluster-$name.asok'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-choose-args/$name.log'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-choose-args/$name.pid'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' '
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=
2026-03-31T19:02:45.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-choose-args/2
2026-03-31T19:02:45.472 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen
2026-03-31T19:02:45.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=175fd43f-e9ad-4bce-8ea0-26a9fd655b92
2026-03-31T19:02:45.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd2 175fd43f-e9ad-4bce-8ea0-26a9fd655b92'
2026-03-31T19:02:45.473 INFO:tasks.workunit.client.0.vm05.stdout:add osd2 175fd43f-e9ad-4bce-8ea0-26a9fd655b92
2026-03-31T19:02:45.473 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key
2026-03-31T19:02:45.486 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQBVGsxpu0P1HBAAUVntwRe9vLNGy3J8AJYlvw==
2026-03-31T19:02:45.486
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQBVGsxpu0P1HBAAUVntwRe9vLNGy3J8AJYlvw=="}' 2026-03-31T19:02:45.486 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new 175fd43f-e9ad-4bce-8ea0-26a9fd655b92 -i td/crush-choose-args/2/new.json 2026-03-31T19:02:45.725 INFO:tasks.workunit.client.0.vm05.stdout:2 2026-03-31T19:02:45.735 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-choose-args/2/new.json 2026-03-31T19:02:45.736 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 2 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/2 --osd-journal=td/crush-choose-args/2/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBVGsxpu0P1HBAAUVntwRe9vLNGy3J8AJYlvw== --osd-uuid 175fd43f-e9ad-4bce-8ea0-26a9fd655b92 2026-03-31T19:02:45.757 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:45.755+0000 7fca5da20900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:45.758 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:45.757+0000 7fca5da20900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:45.759 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:45.758+0000 7fca5da20900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:45.759 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:45.758+0000 7fca5da20900 -1 bdev(0x55dd4808cc00 td/crush-choose-args/2/block) open stat got: (1) Operation not permitted 2026-03-31T19:02:45.759 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:45.758+0000 7fca5da20900 -1 bluestore(td/crush-choose-args/2) _read_fsid unparsable uuid 2026-03-31T19:02:46.210 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-choose-args/2/keyring 2026-03-31T19:02:46.210 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:02:46.211 INFO:tasks.workunit.client.0.vm05.stdout:adding osd2 key to auth repository 2026-03-31T19:02:46.211 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd2 key to auth repository 2026-03-31T19:02:46.211 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-choose-args/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:02:46.545 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.2 2026-03-31T19:02:46.545 INFO:tasks.workunit.client.0.vm05.stdout:start osd.2 2026-03-31T19:02:46.545 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: 
ceph-osd -i 2 --fsid=ed255af3-f6e2-4662-a5cf-df0bcdedb8dc --auth-supported=none --mon-host=127.0.0.1:7131 --crush-location=root=default,host=HOST --osd-crush-initial-weight=3 --osd-class-update-on-start=false --osd-crush-update-weight-set=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-choose-args/2 --osd-journal=td/crush-choose-args/2/journal --chdir= --run-dir=td/crush-choose-args '--admin-socket=/tmp/ceph-asok.51199/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-choose-args/$name.log' '--pid-file=td/crush-choose-args/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:02:46.545 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:02:46.545 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:02:46.545 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:02:46.564 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:46.563+0000 7f5d09657900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:46.566 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:46.564+0000 7f5d09657900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:46.567 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:46.565+0000 7f5d09657900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:46.697 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:46.695+0000 7f5d09657900 -1 Falling back to public interface 2026-03-31T19:02:46.783 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 2 2026-03-31T19:02:46.784 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:02:46.784 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=2 2026-03-31T19:02:46.784 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:02:46.784 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:02:46.784 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:46.784 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:02:46.784 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:46.784 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:46.784 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.2 up' 2026-03-31T19:02:46.844 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:46.843+0000 7f5d09657900 -1 osd.2 0 log_to_monitors true 2026-03-31T19:02:47.030 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:02:48.031 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:02:48.031 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:02:48.031 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:48.031 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:02:48.031 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:48.031 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.2 up' 2026-03-31T19:02:48.270 INFO:tasks.workunit.client.0.vm05.stdout:osd.2 up in weight 1 up_from 13 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1417914319,v1:127.0.0.1:6819/1417914319] [v2:127.0.0.1:6820/1417914319,v1:127.0.0.1:6821/1417914319] exists,up 175fd43f-e9ad-4bce-8ea0-26a9fd655b92 2026-03-31T19:02:48.270 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:02:48.270 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:02:48.271 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:02:48.271 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:182: TEST_reweight: ceph osd crush tree 2026-03-31T19:02:48.517 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT (compat) TYPE NAME 2026-03-31T19:02:48.517 INFO:tasks.workunit.client.0.vm05.stdout:-1 
9.00000 root default 2026-03-31T19:02:48.517 INFO:tasks.workunit.client.0.vm05.stdout:-2 9.00000 5.00000 host HOST 2026-03-31T19:02:48.517 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0 2026-03-31T19:02:48.517 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 3.00000 osd.1 2026-03-31T19:02:48.517 INFO:tasks.workunit.client.0.vm05.stdout: 2 3.00000 0 osd.2 2026-03-31T19:02:48.527 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:183: TEST_reweight: ceph osd crush tree 2026-03-31T19:02:48.527 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:183: TEST_reweight: grep host 2026-03-31T19:02:48.527 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:183: TEST_reweight: grep '9.00000 5.00000' 2026-03-31T19:02:48.773 INFO:tasks.workunit.client.0.vm05.stdout:-2 9.00000 5.00000 host HOST 2026-03-31T19:02:48.773 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:185: TEST_reweight: ceph osd crush reweight osd.2 4 2026-03-31T19:02:49.097 INFO:tasks.workunit.client.0.vm05.stderr:reweighted item id 2 name 'osd.2' to 4 in crush map 2026-03-31T19:02:49.108 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:186: TEST_reweight: ceph osd crush tree 2026-03-31T19:02:49.351 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT (compat) TYPE NAME 2026-03-31T19:02:49.351 INFO:tasks.workunit.client.0.vm05.stdout:-1 10.00000 root default 2026-03-31T19:02:49.351 INFO:tasks.workunit.client.0.vm05.stdout:-2 10.00000 5.00000 host HOST 2026-03-31T19:02:49.351 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0 2026-03-31T19:02:49.351 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 3.00000 osd.1 2026-03-31T19:02:49.351 
INFO:tasks.workunit.client.0.vm05.stdout: 2 4.00000 0 osd.2 2026-03-31T19:02:49.361 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:187: TEST_reweight: ceph osd crush tree 2026-03-31T19:02:49.361 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:187: TEST_reweight: grep host 2026-03-31T19:02:49.361 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:187: TEST_reweight: grep '10.00000 5.00000' 2026-03-31T19:02:49.606 INFO:tasks.workunit.client.0.vm05.stdout:-2 10.00000 5.00000 host HOST 2026-03-31T19:02:49.606 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:189: TEST_reweight: ceph osd crush weight-set reweight-compat osd.2 4 2026-03-31T19:02:49.900 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:190: TEST_reweight: ceph osd crush tree 2026-03-31T19:02:50.138 INFO:tasks.workunit.client.0.vm05.stdout:ID CLASS WEIGHT (compat) TYPE NAME 2026-03-31T19:02:50.138 INFO:tasks.workunit.client.0.vm05.stdout:-1 10.00000 root default 2026-03-31T19:02:50.138 INFO:tasks.workunit.client.0.vm05.stdout:-2 10.00000 9.00000 host HOST 2026-03-31T19:02:50.138 INFO:tasks.workunit.client.0.vm05.stdout: 0 3.00000 2.00000 osd.0 2026-03-31T19:02:50.138 INFO:tasks.workunit.client.0.vm05.stdout: 1 3.00000 3.00000 osd.1 2026-03-31T19:02:50.138 INFO:tasks.workunit.client.0.vm05.stdout: 2 4.00000 4.00000 osd.2 2026-03-31T19:02:50.148 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:191: TEST_reweight: ceph osd crush tree 2026-03-31T19:02:50.148 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:191: TEST_reweight: grep 
host 2026-03-31T19:02:50.148 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:191: TEST_reweight: grep '10.00000 9.00000' 2026-03-31T19:02:50.398 INFO:tasks.workunit.client.0.vm05.stdout:-2 10.00000 9.00000 host HOST 2026-03-31T19:02:50.399 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-choose-args.sh:41: run: teardown td/crush-choose-args 2026-03-31T19:02:50.399 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-choose-args 2026-03-31T19:02:50.399 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs= 2026-03-31T19:02:50.399 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-choose-args KILL 2026-03-31T19:02:50.399 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:02:50.399 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:02:50.399 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:02:50.399 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:02:50.399 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:02:50.510 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 
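The three `grep` checks above pin down exactly what TEST_reweight verifies: `ceph osd crush reweight` moves only the WEIGHT column of `ceph osd crush tree` (host line goes 9/5 to 10/5), while `ceph osd crush weight-set reweight-compat` moves only the (compat) column (10/5 to 10/9). A toy shell model of those two columns (not Ceph code; the variable names are invented for illustration) reproduces the three host-line totals the test greps for:

```shell
# Toy model (not Ceph code) of the two columns `ceph osd crush tree` prints:
# WEIGHT (changed by `crush reweight`) and (compat) (changed by
# `weight-set reweight-compat`).
w0=3; w1=3; w2=3      # CRUSH weights of osd.0..2 (--osd-crush-initial-weight=3)
c0=2; c1=3; c2=0      # compat weight-set values seen in the tree output

host_totals() { echo "$((w0 + w1 + w2)) $((c0 + c1 + c2))"; }

host_totals           # "9 5"  -> the '9.00000 5.00000' host line
w2=4                  # crush reweight osd.2 4: only WEIGHT moves
host_totals           # "10 5" -> the '10.00000 5.00000' host line
c2=4                  # weight-set reweight-compat osd.2 4: (compat) moves
host_totals           # "10 9" -> the '10.00000 9.00000' host line
```

The compat weight set exists so pre-luminous clients, which only understand plain CRUSH weights, can be served different effective weights than the primary map.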
2026-03-31T19:02:50.510 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:02:50.511 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:02:50.511 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 2026-03-31T19:02:50.512 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:02:50.512 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:02:50.512 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:02:50.513 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:50.513 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:02:50.514 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:02:50.514 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:50.514 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:02:50.515 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']' 2026-03-31T19:02:50.515 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-choose-args 2026-03-31T19:02:50.525 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:02:50.525 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:50.525 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:50.525 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.51199 2026-03-31T19:02:50.526 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:02:50.526 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:02:50.526 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2405: main: code=0 2026-03-31T19:02:50.526 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2409: main: teardown td/crush-choose-args 0 2026-03-31T19:02:50.526 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-choose-args 2026-03-31T19:02:50.526 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs=0 
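wait_for_osd, traced a few entries back, is a plain poll-and-retry loop: run `ceph osd dump | grep "osd.$id up"` up to 300 times, sleeping one second between attempts, and return nonzero if the OSD never shows up. A self-contained sketch of the same pattern, with a stub predicate standing in for the real `ceph osd dump` check:

```shell
# Poll-and-retry pattern from wait_for_osd (ceph-helpers.sh): up to 300
# one-second attempts, stopping as soon as the predicate holds.
# osd_is_up is a stub here; the real helper greps `ceph osd dump` output.
polls=0
osd_is_up() { [ "$polls" -ge 2 ]; }   # stub: reports "up" on the second poll

wait_for_up() {
    status=1
    i=0
    while [ "$i" -lt 300 ]; do
        polls=$((polls + 1))
        if osd_is_up; then
            status=0
            break
        fi
        sleep 1
        i=$((i + 1))
    done
    return "$status"
}

wait_for_up && echo "up after $polls polls"
```

In the run above the loop needed exactly one retry: the first `grep 'osd.2 up'` at 19:02:46 found nothing, and the second at 19:02:48 matched.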
2026-03-31T19:02:50.526 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-choose-args KILL 2026-03-31T19:02:50.527 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:02:50.527 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:02:50.527 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:02:50.527 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:02:50.527 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:02:50.529 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:02:50.529 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:02:50.530 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:02:50.530 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 
2026-03-31T19:02:50.531 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:02:50.531 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:02:50.531 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:02:50.532 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:50.532 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:02:50.532 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:02:50.532 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:50.533 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:02:50.534 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-31T19:02:50.534 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-choose-args 2026-03-31T19:02:50.535 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:02:50.535 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:50.535 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.51199 2026-03-31T19:02:50.535 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.51199 2026-03-31T19:02:50.536 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:02:50.536 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:02:50.536 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2410: main: return 0 2026-03-31T19:02:50.536 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-31T19:02:50.537 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-31T19:02:50.600 INFO:tasks.workunit:Running workunit crush/crush-classes.sh... 
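The DEBUG line above shows how the workunit task launches every standalone script: create a per-client tmp directory, export the CEPH_* test environment, and run the script under a 3-hour `timeout` (plus adjust-ulimits and ceph-coverage wrappers). A stripped-down sketch of that launch shape; the inner `sh -c` echo is a placeholder for crush-classes.sh:

```shell
# Minimal version of the workunit launch pattern seen in the DEBUG line:
# scratch dir + exported test environment + a hard timeout around the script.
tmp=$(mktemp -d)
mkdir -p -- "$tmp/client.0/tmp"

out=$(cd -- "$tmp/client.0/tmp" && \
      TESTDIR="$tmp" CEPH_ID="0" CEPH_ARGS="--cluster ceph" \
      timeout 3h sh -c 'echo "client.$CEPH_ID ready in $TESTDIR"')
echo "$out"

rm -rf -- "$tmp"   # the real task cleans up the same way, via `sudo rm -rf`
```

The `timeout 3h` is what turns a hung standalone script into a hard failure instead of an indefinitely stuck job.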
2026-03-31T19:02:50.600 DEBUG:teuthology.orchestra.run.vm05:workunit test crush/crush-classes.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=0392f78529848ec72469e8e431875cb98d3a5fb4 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh
2026-03-31T19:02:50.661 INFO:tasks.workunit.client.0.vm05.stderr:stty: 'standard input': Inappropriate ioctl for device
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: '
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2398: main: export PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2398: main: PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2399: main: export PYTHONWARNINGS=ignore
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2399: main: PYTHONWARNINGS=ignore
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2400: main: export CEPH_CONF=/dev/null
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2400: main: CEPH_CONF=/dev/null
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2401: main: unset CEPH_ARGS
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2403: main: local code
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2404: main: run td/crush-classes
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:21: run: local dir=td/crush-classes
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:22: run: shift
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:24: run: export CEPH_MON=127.0.0.1:7130
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:24: run: CEPH_MON=127.0.0.1:7130
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:25: run: export CEPH_ARGS
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:26: run: uuidgen
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:26: run: CEPH_ARGS+='--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none '
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:27: run: CEPH_ARGS+='--mon-host=127.0.0.1:7130 '
2026-03-31T19:02:50.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:31: run: CEPH_ARGS+='--osd-class-update-on-start=false '
2026-03-31T19:02:50.666 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:33: run: set
2026-03-31T19:02:50.666 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:33: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p'
2026-03-31T19:02:50.667 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:33: run: local 'funcs=TEST_classes
2026-03-31T19:02:50.667 INFO:tasks.workunit.client.0.vm05.stderr:TEST_mon_classes
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:TEST_reweight_vs_classes
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:TEST_set_device_class'
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:34: run: for func in $funcs
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:35: run: setup td/crush-classes
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:122: setup: local dir=td/crush-classes
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:123: setup: teardown td/crush-classes
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-classes
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs=
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-classes KILL
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true
2026-03-31T19:02:50.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace
2026-03-31T19:02:50.670 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0
2026-03-31T19:02:50.670 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname
2026-03-31T19:02:50.671 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-31T19:02:50.671 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T .
2026-03-31T19:02:50.671 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']'
2026-03-31T19:02:50.672 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no
2026-03-31T19:02:50.672 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern
2026-03-31T19:02:50.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:50.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']'
2026-03-31T19:02:50.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$'
2026-03-31T19:02:50.673 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T19:02:50.674 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-31T19:02:50.675 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']'
2026-03-31T19:02:50.675 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-classes
2026-03-31T19:02:50.676 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir
2026-03-31T19:02:50.676 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:50.676 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997
2026-03-31T19:02:50.676 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.65997
2026-03-31T19:02:50.676 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']'
2026-03-31T19:02:50.677 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0
2026-03-31T19:02:50.677 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:124: setup: mkdir -p td/crush-classes
2026-03-31T19:02:50.678 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: get_asok_dir
2026-03-31T19:02:50.678 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:50.678 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997
2026-03-31T19:02:50.678 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: mkdir -p /tmp/ceph-asok.65997
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: ulimit -n
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: '[' 1024 -le 1024 ']'
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:127: setup: ulimit -n 4096
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:129: setup: '[' -z '' ']'
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:130: setup: trap 'teardown td/crush-classes 1' TERM HUP INT
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:36: run: TEST_classes td/crush-classes
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:89: TEST_classes: local dir=td/crush-classes
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:91: TEST_classes: run_mon td/crush-classes a
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:439: run_mon: local dir=td/crush-classes
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:440: run_mon: shift
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:441: run_mon: local id=a
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:442: run_mon: shift
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:443: run_mon: local data=td/crush-classes/a
2026-03-31T19:02:50.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:446: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-classes/a --run-dir=td/crush-classes
2026-03-31T19:02:50.712 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: get_asok_path
2026-03-31T19:02:50.712 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=
2026-03-31T19:02:50.712 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']'
2026-03-31T19:02:50.712 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir
2026-03-31T19:02:50.712 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:50.712 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997
2026-03-31T19:02:50.713 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok'
2026-03-31T19:02:50.713 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-classes/a '--log-file=td/crush-classes/$name.log' '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --mon-cluster-log-file=td/crush-classes/log --run-dir=td/crush-classes '--pid-file=td/crush-classes/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-31T19:02:50.744 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: cat
2026-03-31T19:02:50.745 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a fsid
2026-03-31T19:02:50.745 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon
2026-03-31T19:02:50.745 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a
2026-03-31T19:02:50.745 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=fsid
2026-03-31T19:02:50.745 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a
2026-03-31T19:02:50.745 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a
2026-03-31T19:02:50.745 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']'
2026-03-31T19:02:50.745 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir
2026-03-31T19:02:50.745 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:50.745 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997
2026-03-31T19:02:50.746 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.65997/ceph-mon.a.asok
2026-03-31T19:02:50.746 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.65997/ceph-mon.a.asok
2026-03-31T19:02:50.746 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS=
2026-03-31T19:02:50.746 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.65997/ceph-mon.a.asok config get fsid
2026-03-31T19:02:50.746 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .fsid
2026-03-31T19:02:50.800 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a mon_host
2026-03-31T19:02:50.800 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon
2026-03-31T19:02:50.800 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a
2026-03-31T19:02:50.800 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=mon_host
2026-03-31T19:02:50.801 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a
2026-03-31T19:02:50.801 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a
2026-03-31T19:02:50.801 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']'
2026-03-31T19:02:50.801 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir
2026-03-31T19:02:50.801 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:50.801 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997
2026-03-31T19:02:50.801 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.65997/ceph-mon.a.asok
2026-03-31T19:02:50.801 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.65997/ceph-mon.a.asok
2026-03-31T19:02:50.801 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS=
2026-03-31T19:02:50.801 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.65997/ceph-mon.a.asok config get mon_host
2026-03-31T19:02:50.802 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .mon_host
2026-03-31T19:02:50.849 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:92: TEST_classes: run_osd td/crush-classes 0
2026-03-31T19:02:50.849 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-classes
2026-03-31T19:02:50.849 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift
2026-03-31T19:02:50.849 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=0
2026-03-31T19:02:50.849 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift
2026-03-31T19:02:50.849 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/0
2026-03-31T19:02:50.849 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false '
2026-03-31T19:02:50.849 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-31T19:02:50.849 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-31T19:02:50.849 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-classes/0'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-classes/0/journal'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir='
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+=
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' '
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=
2026-03-31T19:02:50.850 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-classes/0
2026-03-31T19:02:50.851 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen
2026-03-31T19:02:50.852 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=0d9e3fc7-467a-4fa1-84ab-97a0b5a4582b
2026-03-31T19:02:50.852 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd0 0d9e3fc7-467a-4fa1-84ab-97a0b5a4582b'
2026-03-31T19:02:50.852 INFO:tasks.workunit.client.0.vm05.stdout:add osd0 0d9e3fc7-467a-4fa1-84ab-97a0b5a4582b
2026-03-31T19:02:50.853 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key
2026-03-31T19:02:50.865 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQBaGsxpIz2JMxAAxiGBv7vBC4Vuo44tjAwVxg==
2026-03-31T19:02:50.865 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQBaGsxpIz2JMxAAxiGBv7vBC4Vuo44tjAwVxg=="}'
2026-03-31T19:02:50.865 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new 0d9e3fc7-467a-4fa1-84ab-97a0b5a4582b -i td/crush-classes/0/new.json
2026-03-31T19:02:50.979 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-31T19:02:50.987 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-classes/0/new.json
2026-03-31T19:02:50.987 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 0 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBaGsxpIz2JMxAAxiGBv7vBC4Vuo44tjAwVxg== --osd-uuid 0d9e3fc7-467a-4fa1-84ab-97a0b5a4582b
2026-03-31T19:02:51.006 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:51.005+0000 7f17af259900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:51.008 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:51.007+0000 7f17af259900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:51.009 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:51.008+0000 7f17af259900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:51.009 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:51.008+0000 7f17af259900 -1 bdev(0x562d61930c00 td/crush-classes/0/block) open stat got: (1) Operation not permitted
2026-03-31T19:02:51.010 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:51.008+0000 7f17af259900 -1 bluestore(td/crush-classes/0) _read_fsid unparsable uuid
2026-03-31T19:02:51.464 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/0/keyring
2026-03-31T19:02:51.464 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat
2026-03-31T19:02:51.465 INFO:tasks.workunit.client.0.vm05.stdout:adding osd0 key to auth repository
2026-03-31T19:02:51.465 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd0 key to auth repository
2026-03-31T19:02:51.465 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-31T19:02:51.576 INFO:tasks.workunit.client.0.vm05.stdout:start osd.0
2026-03-31T19:02:51.576 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.0
2026-03-31T19:02:51.576 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 0 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-31T19:02:51.576 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json
2026-03-31T19:02:51.576 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"'
2026-03-31T19:02:51.576 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]'
2026-03-31T19:02:51.593 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:51.592+0000 7f005c257900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:51.595 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:51.593+0000 7f005c257900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:51.596 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:51.594+0000 7f005c257900 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-31T19:02:51.690 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 0 2026-03-31T19:02:51.690 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:02:51.690 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=0 2026-03-31T19:02:51.690 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:02:51.690 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:51.690 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:02:51.690 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:51.690 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:02:51.690 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:51.690 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:02:51.708 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:51.707+0000 7f005c257900 -1 Falling back to public interface 2026-03-31T19:02:51.797 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:02:51.864 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:51.863+0000 7f005c257900 -1 osd.0 0 log_to_monitors true 2026-03-31T19:02:52.799 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:02:52.799 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:02:52.799 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:52.799 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:02:52.799 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:52.799 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/155743392,v1:127.0.0.1:6801/155743392] [v2:127.0.0.1:6802/155743392,v1:127.0.0.1:6803/155743392] exists,up 0d9e3fc7-467a-4fa1-84ab-97a0b5a4582b 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:93: TEST_classes: run_osd td/crush-classes 1 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-classes 
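The trace above (ceph-helpers.sh:972-985) is the `wait_for_osd` polling loop: echo the attempt counter, grep `ceph osd dump` for `osd.<id> up`, sleep one second, up to 300 tries. A minimal sketch of that loop with the cluster probe injected as a command, so the logic can run without a cluster; `wait_for_osd_sketch` and `SLEEP_INTERVAL` are illustrative names, not part of the helper:

```shell
# Sketch of the wait_for_osd loop (ceph-helpers.sh:972-985). The real
# helper's probe is: ceph osd dump | grep "osd.$id up"; here it is a
# parameter so the loop can be exercised standalone.
wait_for_osd_sketch() {
    local probe=$1              # command that succeeds once the OSD is up
    local retries=${2:-300}     # the helper tries 300 times, 1s apart
    local status=1
    for ((i = 0; i < retries; i++)); do
        echo "$i"               # the attempt-counter echoes seen in the log
        if $probe; then
            status=0
            break
        fi
        sleep "${SLEEP_INTERVAL:-1}"
    done
    return $status
}
```

With a probe that succeeds immediately (`wait_for_osd_sketch true`) the loop prints `0` and returns success on the first pass, matching the `echo 0` / `break` / `return 0` sequence in the trace.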
2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=1 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/1 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-classes/1' 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-classes/1/journal' 2026-03-31T19:02:52.908 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:02:52.908 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:02:52.909 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: 
run_osd: ceph_args+= 2026-03-31T19:02:52.909 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-classes/1 2026-03-31T19:02:52.910 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:02:52.911 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=04fc0c7f-d4fd-458d-9441-b08d089c1096 2026-03-31T19:02:52.911 INFO:tasks.workunit.client.0.vm05.stdout:add osd1 04fc0c7f-d4fd-458d-9441-b08d089c1096 2026-03-31T19:02:52.911 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd1 04fc0c7f-d4fd-458d-9441-b08d089c1096' 2026-03-31T19:02:52.911 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:02:52.924 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQBcGsxpMHIENxAAkQrZ270BIrUvYJDX/If6Sw== 2026-03-31T19:02:52.924 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQBcGsxpMHIENxAAkQrZ270BIrUvYJDX/If6Sw=="}' 2026-03-31T19:02:52.924 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new 04fc0c7f-d4fd-458d-9441-b08d089c1096 -i td/crush-classes/1/new.json 2026-03-31T19:02:53.038 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:02:53.047 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-classes/1/new.json 2026-03-31T19:02:53.048 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 1 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBcGsxpMHIENxAAkQrZ270BIrUvYJDX/If6Sw== --osd-uuid 04fc0c7f-d4fd-458d-9441-b08d089c1096 2026-03-31T19:02:53.066 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:53.065+0000 7fd441f70900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:53.068 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:53.067+0000 7fd441f70900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:53.069 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:53.068+0000 7fd441f70900 -1 WARNING: all dangerous and experimental features are enabled. 
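Lines 654-660 of ceph-helpers.sh, traced above, provision the OSD before starting it: generate a uuid, mint a cephx secret with `ceph-authtool --gen-print-key`, register it via `ceph osd new` (which prints the assigned id, `1` here), remove the temporary `new.json`, then run `ceph-osd --mkfs`. A condensed sketch of that flow; the `CEPH`/`AUTHTOOL` override variables and the `_sketch` name are illustrative additions so the sequence can be dry-run, not part of the real helper:

```shell
# Condensed sketch of the provisioning steps in run_osd
# (ceph-helpers.sh:654-660). CEPH and AUTHTOOL default to the real
# binaries but can be overridden (e.g. with `echo`) for a dry run.
provision_osd_sketch() {
    local dir=$1 id=$2
    local ceph=${CEPH:-ceph} authtool=${AUTHTOOL:-ceph-authtool}
    local uuid=${3:-$(uuidgen)}
    mkdir -p "$dir/$id"
    echo "add osd$id $uuid"
    local secret
    secret=$($authtool --gen-print-key)
    echo "{\"cephx_secret\": \"$secret\"}" > "$dir/$id/new.json"
    $ceph osd new "$uuid" -i "$dir/$id/new.json"   # prints the osd id
    rm "$dir/$id/new.json"
    # the real helper then runs:
    #   ceph-osd -i $id $ceph_args --mkfs --key $secret --osd-uuid $uuid
}
```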
2026-03-31T19:02:53.069 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:53.068+0000 7fd441f70900 -1 bdev(0x55a4a56b6c00 td/crush-classes/1/block) open stat got: (1) Operation not permitted 2026-03-31T19:02:53.069 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:53.068+0000 7fd441f70900 -1 bluestore(td/crush-classes/1) _read_fsid unparsable uuid 2026-03-31T19:02:53.482 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/1/keyring 2026-03-31T19:02:53.483 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:02:53.483 INFO:tasks.workunit.client.0.vm05.stdout:adding osd1 key to auth repository 2026-03-31T19:02:53.483 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd1 key to auth repository 2026-03-31T19:02:53.483 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:02:53.592 INFO:tasks.workunit.client.0.vm05.stdout:start osd.1 2026-03-31T19:02:53.592 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.1 2026-03-31T19:02:53.592 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 1 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes 
'--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:02:53.592 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:02:53.593 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:02:53.593 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:02:53.611 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:53.610+0000 7f2a7009f900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:53.613 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:53.612+0000 7f2a7009f900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:53.615 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:53.613+0000 7f2a7009f900 -1 WARNING: all dangerous and experimental features are enabled. 
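Before entering the wait loop, run_osd confirms the cluster-wide `noup` flag is not set (ceph-helpers.sh:673, traced just above): `ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"'`. A stdin-based sketch of the same check; the jq-free fallback branch is an approximation added here, not something the helper does:

```shell
# Sketch of the noup-flag check at ceph-helpers.sh:673. Reads the JSON
# from `ceph osd dump --format=json` on stdin; returns 0 if noup is set.
noup_is_set_sketch() {
    if command -v jq >/dev/null 2>&1; then
        jq '.flags_set[]' | grep -q '"noup"'
    else
        grep -q '"noup"'    # crude fallback on the raw JSON
    fi
}
```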
2026-03-31T19:02:53.698 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 1 2026-03-31T19:02:53.699 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:02:53.699 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=1 2026-03-31T19:02:53.699 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:02:53.699 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:02:53.699 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:53.699 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:02:53.699 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:53.699 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:53.699 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up' 2026-03-31T19:02:53.731 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:53.730+0000 7f2a7009f900 -1 Falling back to public interface 2026-03-31T19:02:53.808 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:02:53.878 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:53.877+0000 7f2a7009f900 -1 osd.1 0 log_to_monitors true 2026-03-31T19:02:54.810 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:02:54.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:02:54.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:54.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:02:54.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:54.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up' 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stdout:osd.1 up in weight 1 up_from 7 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6808/1554300821,v1:127.0.0.1:6809/1554300821] [v2:127.0.0.1:6810/1554300821,v1:127.0.0.1:6811/1554300821] exists,up 04fc0c7f-d4fd-458d-9441-b08d089c1096 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:94: TEST_classes: run_osd td/crush-classes 2 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local 
dir=td/crush-classes 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=2 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/2 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-classes/2' 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-classes/2/journal' 2026-03-31T19:02:54.910 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:02:54.910 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:02:54.911 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:02:54.911 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:02:54.912 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:02:54.912 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:02:54.912 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:02:54.912 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: 
run_osd: ceph_args+= 2026-03-31T19:02:54.912 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-classes/2 2026-03-31T19:02:54.913 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:02:54.913 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=e977df73-178a-42f9-b5ab-953fa723269a 2026-03-31T19:02:54.913 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd2 e977df73-178a-42f9-b5ab-953fa723269a' 2026-03-31T19:02:54.913 INFO:tasks.workunit.client.0.vm05.stdout:add osd2 e977df73-178a-42f9-b5ab-953fa723269a 2026-03-31T19:02:54.914 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:02:54.926 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQBeGsxpTWArNxAAwD+xgvWGGpUB3rwKAUejSw== 2026-03-31T19:02:54.926 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQBeGsxpTWArNxAAwD+xgvWGGpUB3rwKAUejSw=="}' 2026-03-31T19:02:54.926 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new e977df73-178a-42f9-b5ab-953fa723269a -i td/crush-classes/2/new.json 2026-03-31T19:02:55.032 INFO:tasks.workunit.client.0.vm05.stdout:2 2026-03-31T19:02:55.039 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-classes/2/new.json 2026-03-31T19:02:55.039 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 2 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBeGsxpTWArNxAAwD+xgvWGGpUB3rwKAUejSw== --osd-uuid e977df73-178a-42f9-b5ab-953fa723269a 2026-03-31T19:02:55.057 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:55.056+0000 7f61ca2e6900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:55.059 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:55.058+0000 7f61ca2e6900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:55.060 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:55.059+0000 7f61ca2e6900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:55.061 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:55.059+0000 7f61ca2e6900 -1 bdev(0x55ae11384c00 td/crush-classes/2/block) open stat got: (1) Operation not permitted 2026-03-31T19:02:55.061 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:55.059+0000 7f61ca2e6900 -1 bluestore(td/crush-classes/2) _read_fsid unparsable uuid 2026-03-31T19:02:55.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/2/keyring 2026-03-31T19:02:55.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:02:55.471 INFO:tasks.workunit.client.0.vm05.stdout:adding osd2 key to auth repository 2026-03-31T19:02:55.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd2 key to auth repository 2026-03-31T19:02:55.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:02:55.582 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.2 2026-03-31T19:02:55.582 INFO:tasks.workunit.client.0.vm05.stdout:start osd.2 2026-03-31T19:02:55.583 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 2 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes 
'--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:02:55.583 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:02:55.583 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:02:55.583 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:02:55.600 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:55.599+0000 7f8a13a72900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:55.602 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:55.601+0000 7f8a13a72900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:55.603 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:55.602+0000 7f8a13a72900 -1 WARNING: all dangerous and experimental features are enabled. 
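The `--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok` argument in the command lines above is built by `get_asok_dir`/`get_asok_path` (ceph-helpers.sh:99-111, traced twice in this excerpt): the directory defaults to `/tmp/ceph-asok.<shell pid>` unless `CEPH_ASOK_DIR` overrides it, and with no name argument the `$cluster-$name` metavariables are left literal for the daemon to expand per-daemon. A sketch under those assumptions (`_sketch` names are illustrative):

```shell
# Sketch of get_asok_dir / get_asok_path (ceph-helpers.sh:99-111).
get_asok_dir_sketch() {
    if [ -n "$CEPH_ASOK_DIR" ]; then
        echo "$CEPH_ASOK_DIR"
    else
        echo "/tmp/ceph-asok.$$"    # 65997 in the run above
    fi
}
get_asok_path_sketch() {
    local name=$1
    if [ -n "$name" ]; then
        echo "$(get_asok_dir_sketch)/ceph-$name.asok"
    else
        # left literal; the daemon expands $cluster and $name itself
        echo "$(get_asok_dir_sketch)/\$cluster-\$name.asok"
    fi
}
```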
2026-03-31T19:02:55.691 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 2 2026-03-31T19:02:55.691 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:02:55.691 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=2 2026-03-31T19:02:55.691 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:02:55.691 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:02:55.691 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:55.691 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:02:55.691 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:55.691 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:55.691 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.2 up' 2026-03-31T19:02:55.764 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:55.763+0000 7f8a13a72900 -1 Falling back to public interface 2026-03-31T19:02:55.796 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:02:55.993 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:55.991+0000 7f8a13a72900 -1 osd.2 0 log_to_monitors true 2026-03-31T19:02:56.798 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:02:56.798 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:02:56.798 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:02:56.798 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:02:56.798 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:02:56.798 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.2 up' 2026-03-31T19:02:56.905 INFO:tasks.workunit.client.0.vm05.stdout:osd.2 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6816/1404970604,v1:127.0.0.1:6817/1404970604] [v2:127.0.0.1:6818/1404970604,v1:127.0.0.1:6819/1404970604] exists,up e977df73-178a-42f9-b5ab-953fa723269a 2026-03-31T19:02:56.905 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:02:56.905 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:02:56.905 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:02:56.905 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:95: TEST_classes: create_rbd_pool 2026-03-31T19:02:56.905 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:527: create_rbd_pool: ceph osd pool delete rbd 
rbd --yes-i-really-really-mean-it 2026-03-31T19:02:57.008 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd' does not exist 2026-03-31T19:02:57.015 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:528: create_rbd_pool: create_pool rbd 4 2026-03-31T19:02:57.015 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:533: create_pool: ceph osd pool create rbd 4 2026-03-31T19:02:57.163 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd' already exists 2026-03-31T19:02:57.170 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:534: create_pool: sleep 1 2026-03-31T19:02:58.171 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:529: create_rbd_pool: rbd pool init rbd 2026-03-31T19:02:58.447 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:97: TEST_classes: get_osds_up rbd SOMETHING 2026-03-31T19:02:58.447 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd 2026-03-31T19:02:58.447 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING 2026-03-31T19:02:58.447 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING 2026-03-31T19:02:58.447 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . 
-o ' ' 2026-03-31T19:02:58.549 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=1 2 0 ' 2026-03-31T19:02:58.549 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 1 2 0 2026-03-31T19:02:58.550 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:97: TEST_classes: test '1 2 0' == '1 2 0' 2026-03-31T19:02:58.550 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:98: TEST_classes: add_something td/crush-classes SOMETHING 2026-03-31T19:02:58.550 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:42: add_something: local dir=td/crush-classes 2026-03-31T19:02:58.550 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:43: add_something: local obj=SOMETHING 2026-03-31T19:02:58.550 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:45: add_something: local payload=ABCDEF 2026-03-31T19:02:58.550 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:46: add_something: echo ABCDEF 2026-03-31T19:02:58.554 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:47: add_something: rados --pool rbd put SOMETHING td/crush-classes/ORIGINAL 2026-03-31T19:02:58.571 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:104: TEST_classes: ceph osd getcrushmap 2026-03-31T19:02:58.665 INFO:tasks.workunit.client.0.vm05.stderr:4 2026-03-31T19:02:58.672 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:105: TEST_classes: crushtool -d td/crush-classes/map -o td/crush-classes/map.txt 2026-03-31T19:02:58.683 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:106: TEST_classes: sed -i -e '/device 0 osd.0/s/$/ class ssd/' -e '/step take default/s/$/ class ssd/' td/crush-classes/map.txt 2026-03-31T19:02:58.684 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:110: TEST_classes: crushtool -c td/crush-classes/map.txt -o td/crush-classes/map-new 2026-03-31T19:02:58.694 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:111: TEST_classes: ceph osd setcrushmap -i td/crush-classes/map-new 2026-03-31T19:02:58.950 INFO:tasks.workunit.client.0.vm05.stderr:6 2026-03-31T19:02:58.960 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:117: TEST_classes: ok=false 2026-03-31T19:02:58.960 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:118: TEST_classes: for delay in 2 4 8 16 32 64 128 256 2026-03-31T19:02:58.961 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:119: TEST_classes: get_osds_up rbd SOMETHING_ELSE 2026-03-31T19:02:58.961 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd 2026-03-31T19:02:58.961 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING_ELSE 2026-03-31T19:02:58.961 
INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING_ELSE 2026-03-31T19:02:58.961 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . -o ' ' 2026-03-31T19:02:59.070 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=0 ' 2026-03-31T19:02:59.070 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 0 2026-03-31T19:02:59.070 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:119: TEST_classes: test 0 == 0 2026-03-31T19:02:59.070 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:120: TEST_classes: ok=true 2026-03-31T19:02:59.070 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:121: TEST_classes: break 2026-03-31T19:02:59.070 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:127: TEST_classes: true 2026-03-31T19:02:59.071 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:132: TEST_classes: add_something td/crush-classes SOMETHING_ELSE 2026-03-31T19:02:59.071 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:42: add_something: local dir=td/crush-classes 2026-03-31T19:02:59.071 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:43: add_something: local obj=SOMETHING_ELSE 2026-03-31T19:02:59.071 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:45: add_something: local payload=ABCDEF 2026-03-31T19:02:59.071 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:46: add_something: echo ABCDEF 2026-03-31T19:02:59.071 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:47: add_something: rados --pool rbd put SOMETHING_ELSE td/crush-classes/ORIGINAL 2026-03-31T19:02:59.092 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:138: TEST_classes: ceph osd crush dump 2026-03-31T19:02:59.092 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:138: TEST_classes: grep -q '~ssd' 2026-03-31T19:02:59.193 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:37: run: teardown td/crush-classes 2026-03-31T19:02:59.193 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-classes 2026-03-31T19:02:59.193 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs= 2026-03-31T19:02:59.193 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-classes KILL 2026-03-31T19:02:59.193 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:02:59.193 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:02:59.193 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:02:59.194 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:02:59.194 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:02:59.301 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:02:59.301 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:02:59.302 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:02:59.302 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 
2026-03-31T19:02:59.303 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:02:59.303 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:02:59.303 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:02:59.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:59.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:02:59.304 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:02:59.304 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:59.305 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:02:59.306 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']' 2026-03-31T19:02:59.306 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-classes 2026-03-31T19:02:59.315 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:02:59.315 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:59.315 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:02:59.316 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.65997 2026-03-31T19:02:59.316 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:34: run: for func in $funcs 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:35: run: setup td/crush-classes 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:122: setup: local dir=td/crush-classes 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:123: setup: teardown td/crush-classes 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-classes 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs= 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons 
td/crush-classes KILL 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:02:59.317 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:02:59.318 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:02:59.319 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:02:59.319 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:02:59.319 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 
2026-03-31T19:02:59.320 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:02:59.320 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:02:59.320 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:02:59.321 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:59.321 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:02:59.321 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:02:59.322 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:02:59.322 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:02:59.323 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']' 2026-03-31T19:02:59.323 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-classes 2026-03-31T19:02:59.324 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:02:59.324 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:59.324 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:02:59.324 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.65997 2026-03-31T19:02:59.325 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:02:59.325 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:02:59.325 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:124: setup: mkdir -p td/crush-classes 2026-03-31T19:02:59.326 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: get_asok_dir 2026-03-31T19:02:59.326 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:59.326 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:02:59.326 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: mkdir -p /tmp/ceph-asok.65997 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: ulimit -n 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: '[' 4096 -le 1024 ']' 
2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:129: setup: '[' -z '' ']' 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:130: setup: trap 'teardown td/crush-classes 1' TERM HUP INT 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:36: run: TEST_mon_classes td/crush-classes 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:167: TEST_mon_classes: local dir=td/crush-classes 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:169: TEST_mon_classes: run_mon td/crush-classes a 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:439: run_mon: local dir=td/crush-classes 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:440: run_mon: shift 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:441: run_mon: local id=a 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:442: run_mon: shift 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:443: run_mon: local data=td/crush-classes/a 2026-03-31T19:02:59.327 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:446: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-classes/a --run-dir=td/crush-classes 2026-03-31T19:02:59.351 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: get_asok_path 2026-03-31T19:02:59.351 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:59.351 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:59.351 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:59.351 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:59.351 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:02:59.351 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:02:59.351 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-classes/a '--log-file=td/crush-classes/$name.log' '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --mon-cluster-log-file=td/crush-classes/log --run-dir=td/crush-classes '--pid-file=td/crush-classes/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 
2026-03-31T19:02:59.379 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: cat 2026-03-31T19:02:59.379 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a fsid 2026-03-31T19:02:59.379 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon 2026-03-31T19:02:59.379 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a 2026-03-31T19:02:59.379 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=fsid 2026-03-31T19:02:59.380 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a 2026-03-31T19:02:59.380 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a 2026-03-31T19:02:59.380 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']' 2026-03-31T19:02:59.380 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir 2026-03-31T19:02:59.380 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:59.380 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:02:59.380 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: 
get_asok_path: echo /tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:02:59.380 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:02:59.381 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS= 2026-03-31T19:02:59.381 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.65997/ceph-mon.a.asok config get fsid 2026-03-31T19:02:59.381 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .fsid 2026-03-31T19:02:59.427 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a mon_host 2026-03-31T19:02:59.427 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon 2026-03-31T19:02:59.427 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a 2026-03-31T19:02:59.427 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=mon_host 2026-03-31T19:02:59.427 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a 2026-03-31T19:02:59.427 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a 2026-03-31T19:02:59.427 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n 
mon.a ']' 2026-03-31T19:02:59.427 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir 2026-03-31T19:02:59.428 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:59.428 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:02:59.428 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:02:59.428 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:02:59.428 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS= 2026-03-31T19:02:59.428 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.65997/ceph-mon.a.asok config get mon_host 2026-03-31T19:02:59.428 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .mon_host 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:170: TEST_mon_classes: run_osd td/crush-classes 0 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-classes 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: 
shift 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=0 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/0 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-classes/0' 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-classes/0/journal' 2026-03-31T19:02:59.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:02:59.474 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:02:59.474 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+= 2026-03-31T19:02:59.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir 
-p td/crush-classes/0 2026-03-31T19:02:59.475 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:02:59.476 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=9353b597-8ed4-44fd-8e13-0a4a06e03bfb 2026-03-31T19:02:59.476 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd0 9353b597-8ed4-44fd-8e13-0a4a06e03bfb' 2026-03-31T19:02:59.476 INFO:tasks.workunit.client.0.vm05.stdout:add osd0 9353b597-8ed4-44fd-8e13-0a4a06e03bfb 2026-03-31T19:02:59.476 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:02:59.488 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQBjGsxpUZ4OHRAAxyzubY3IQJq+SGqmEJFbeA== 2026-03-31T19:02:59.488 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQBjGsxpUZ4OHRAAxyzubY3IQJq+SGqmEJFbeA=="}' 2026-03-31T19:02:59.488 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new 9353b597-8ed4-44fd-8e13-0a4a06e03bfb -i td/crush-classes/0/new.json 2026-03-31T19:02:59.597 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:02:59.603 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-classes/0/new.json 2026-03-31T19:02:59.604 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 0 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 
--osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBjGsxpUZ4OHRAAxyzubY3IQJq+SGqmEJFbeA== --osd-uuid 9353b597-8ed4-44fd-8e13-0a4a06e03bfb 2026-03-31T19:02:59.622 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:59.621+0000 7fe0b1967900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:59.624 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:59.623+0000 7fe0b1967900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:02:59.625 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:59.624+0000 7fe0b1967900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:02:59.625 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:59.624+0000 7fe0b1967900 -1 bdev(0x5580ca840c00 td/crush-classes/0/block) open stat got: (1) Operation not permitted 2026-03-31T19:02:59.625 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:02:59.624+0000 7fe0b1967900 -1 bluestore(td/crush-classes/0) _read_fsid unparsable uuid 2026-03-31T19:03:00.039 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/0/keyring 2026-03-31T19:03:00.039 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:03:00.040 INFO:tasks.workunit.client.0.vm05.stdout:adding osd0 key to auth repository 2026-03-31T19:03:00.040 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd0 key to auth repository 2026-03-31T19:03:00.040 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:03:00.149 INFO:tasks.workunit.client.0.vm05.stdout:start osd.0 2026-03-31T19:03:00.149 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.0 2026-03-31T19:03:00.149 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 0 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes 
'--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:03:00.149 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:03:00.149 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:03:00.149 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:03:00.166 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:00.165+0000 7f78fa6f7900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:00.168 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:00.167+0000 7f78fa6f7900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:00.169 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:00.168+0000 7f78fa6f7900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:03:00.254 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 0 2026-03-31T19:03:00.255 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:03:00.255 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=0 2026-03-31T19:03:00.255 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:03:00.255 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:03:00.255 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:00.255 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:03:00.255 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:00.255 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:00.255 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:03:00.321 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:00.319+0000 7f78fa6f7900 -1 Falling back to public interface 2026-03-31T19:03:00.358 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:03:00.441 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:00.439+0000 7f78fa6f7900 -1 osd.0 0 log_to_monitors true 2026-03-31T19:03:01.359 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:01.359 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:03:01.359 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:01.359 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:03:01.359 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:01.359 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:03:01.461 INFO:tasks.workunit.client.0.vm05.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/1448955009,v1:127.0.0.1:6801/1448955009] [v2:127.0.0.1:6802/1448955009,v1:127.0.0.1:6803/1448955009] exists,up 9353b597-8ed4-44fd-8e13-0a4a06e03bfb 2026-03-31T19:03:01.461 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:03:01.461 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:03:01.461 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:03:01.461 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:171: TEST_mon_classes: run_osd td/crush-classes 1 2026-03-31T19:03:01.461 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local 
dir=td/crush-classes 2026-03-31T19:03:01.461 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:03:01.461 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=1 2026-03-31T19:03:01.461 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/1 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-classes/1' 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-classes/1/journal' 2026-03-31T19:03:01.462 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:01.462 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:01.463 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: 
run_osd: ceph_args+= 2026-03-31T19:03:01.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-classes/1 2026-03-31T19:03:01.464 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:03:01.464 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=be044121-f3d2-49bf-8209-ab00b6ceecd1 2026-03-31T19:03:01.465 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd1 be044121-f3d2-49bf-8209-ab00b6ceecd1' 2026-03-31T19:03:01.465 INFO:tasks.workunit.client.0.vm05.stdout:add osd1 be044121-f3d2-49bf-8209-ab00b6ceecd1 2026-03-31T19:03:01.465 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:03:01.477 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQBlGsxpiEpkHBAAhCi8yQOvYcEo2lTFEDlrKg== 2026-03-31T19:03:01.477 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQBlGsxpiEpkHBAAhCi8yQOvYcEo2lTFEDlrKg=="}' 2026-03-31T19:03:01.477 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new be044121-f3d2-49bf-8209-ab00b6ceecd1 -i td/crush-classes/1/new.json 2026-03-31T19:03:01.584 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:01.592 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-classes/1/new.json 2026-03-31T19:03:01.592 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 1 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBlGsxpiEpkHBAAhCi8yQOvYcEo2lTFEDlrKg== --osd-uuid be044121-f3d2-49bf-8209-ab00b6ceecd1 2026-03-31T19:03:01.611 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:01.609+0000 7f826bb25900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:01.612 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:01.611+0000 7f826bb25900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:01.613 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:01.612+0000 7f826bb25900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:03:01.613 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:01.612+0000 7f826bb25900 -1 bdev(0x562f77e14c00 td/crush-classes/1/block) open stat got: (1) Operation not permitted 2026-03-31T19:03:01.613 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:01.612+0000 7f826bb25900 -1 bluestore(td/crush-classes/1) _read_fsid unparsable uuid 2026-03-31T19:03:02.044 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/1/keyring 2026-03-31T19:03:02.044 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:03:02.045 INFO:tasks.workunit.client.0.vm05.stdout:adding osd1 key to auth repository 2026-03-31T19:03:02.045 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd1 key to auth repository 2026-03-31T19:03:02.045 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:03:02.149 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.1 2026-03-31T19:03:02.150 INFO:tasks.workunit.client.0.vm05.stdout:start osd.1 2026-03-31T19:03:02.150 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 1 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes 
'--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:03:02.150 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:03:02.150 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:03:02.150 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:03:02.167 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:02.166+0000 7f864cdda900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:02.169 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:02.167+0000 7f864cdda900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:02.169 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:02.168+0000 7f864cdda900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:03:02.251 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 1 2026-03-31T19:03:02.251 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:03:02.251 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=1 2026-03-31T19:03:02.251 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:03:02.251 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:03:02.251 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:02.251 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:03:02.251 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:02.251 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:02.251 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up' 2026-03-31T19:03:02.294 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:02.292+0000 7f864cdda900 -1 Falling back to public interface 2026-03-31T19:03:02.354 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:03:02.475 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:02.473+0000 7f864cdda900 -1 osd.1 0 log_to_monitors true 2026-03-31T19:03:03.166 
INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:03.165+0000 7f8648d7a640 -1 osd.1 0 waiting for initial osdmap 2026-03-31T19:03:03.355 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:03.356 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:03:03.356 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:03.356 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:03:03.356 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:03.356 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up' 2026-03-31T19:03:03.458 INFO:tasks.workunit.client.0.vm05.stdout:osd.1 up in weight 1 up_from 7 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6808/1969859100,v1:127.0.0.1:6809/1969859100] [v2:127.0.0.1:6810/1969859100,v1:127.0.0.1:6811/1969859100] exists,up be044121-f3d2-49bf-8209-ab00b6ceecd1 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:172: TEST_mon_classes: run_osd td/crush-classes 2 
2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-classes 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=2 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/2 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-classes/2' 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: 
run_osd: ceph_args+=' --osd-journal=td/crush-classes/2/journal' 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:03.459 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' 
2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:03:03.460 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+= 2026-03-31T19:03:03.460 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-classes/2 2026-03-31T19:03:03.461 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:03:03.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=41d5f47e-7561-4a48-9e26-59c0c2fdd555 2026-03-31T19:03:03.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd2 41d5f47e-7561-4a48-9e26-59c0c2fdd555' 2026-03-31T19:03:03.462 INFO:tasks.workunit.client.0.vm05.stdout:add osd2 41d5f47e-7561-4a48-9e26-59c0c2fdd555 2026-03-31T19:03:03.462 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:03:03.475 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQBnGsxpp1Y/HBAAJEiW78U9c4Sryi2GrGDBqQ== 2026-03-31T19:03:03.475 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQBnGsxpp1Y/HBAAJEiW78U9c4Sryi2GrGDBqQ=="}' 2026-03-31T19:03:03.475 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new 41d5f47e-7561-4a48-9e26-59c0c2fdd555 -i td/crush-classes/2/new.json 2026-03-31T19:03:03.583 INFO:tasks.workunit.client.0.vm05.stdout:2 2026-03-31T19:03:03.590 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm 
td/crush-classes/2/new.json 2026-03-31T19:03:03.590 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 2 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBnGsxpp1Y/HBAAJEiW78U9c4Sryi2GrGDBqQ== --osd-uuid 41d5f47e-7561-4a48-9e26-59c0c2fdd555 2026-03-31T19:03:03.609 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:03.607+0000 7f8cc77da900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:03.610 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:03.609+0000 7f8cc77da900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:03.611 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:03.610+0000 7f8cc77da900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:03:03.611 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:03.610+0000 7f8cc77da900 -1 bdev(0x562725c7cc00 td/crush-classes/2/block) open stat got: (1) Operation not permitted 2026-03-31T19:03:03.611 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:03.610+0000 7f8cc77da900 -1 bluestore(td/crush-classes/2) _read_fsid unparsable uuid 2026-03-31T19:03:04.037 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/2/keyring 2026-03-31T19:03:04.037 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:03:04.038 INFO:tasks.workunit.client.0.vm05.stdout:adding osd2 key to auth repository 2026-03-31T19:03:04.038 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd2 key to auth repository 2026-03-31T19:03:04.038 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:03:04.145 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.2 2026-03-31T19:03:04.145 INFO:tasks.workunit.client.0.vm05.stdout:start osd.2 2026-03-31T19:03:04.145 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 2 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes 
'--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:03:04.145 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:03:04.145 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:03:04.145 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:03:04.163 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:04.161+0000 7f25cc585900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:04.164 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:04.163+0000 7f25cc585900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:04.165 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:04.164+0000 7f25cc585900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:03:04.249 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 2 2026-03-31T19:03:04.249 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:03:04.249 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=2 2026-03-31T19:03:04.249 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:03:04.249 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:03:04.249 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:04.249 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:04.250 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:03:04.250 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:04.250 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.2 up' 2026-03-31T19:03:04.339 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:04.337+0000 7f25cc585900 -1 Falling back to public interface 2026-03-31T19:03:04.354 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:03:04.508 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:04.507+0000 7f25cc585900 -1 osd.2 0 log_to_monitors true 2026-03-31T19:03:05.355 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:03:05.356 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:05.356 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:05.356 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:03:05.356 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:05.356 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.2 up' 2026-03-31T19:03:05.468 INFO:tasks.workunit.client.0.vm05.stdout:osd.2 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6816/2873406306,v1:127.0.0.1:6817/2873406306] [v2:127.0.0.1:6818/2873406306,v1:127.0.0.1:6819/2873406306] exists,up 41d5f47e-7561-4a48-9e26-59c0c2fdd555 2026-03-31T19:03:05.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:03:05.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:03:05.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:03:05.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:173: TEST_mon_classes: create_rbd_pool 2026-03-31T19:03:05.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:527: create_rbd_pool: ceph osd pool delete 
rbd rbd --yes-i-really-really-mean-it 2026-03-31T19:03:05.565 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd' does not exist 2026-03-31T19:03:05.572 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:528: create_rbd_pool: create_pool rbd 4 2026-03-31T19:03:05.572 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:533: create_pool: ceph osd pool create rbd 4 2026-03-31T19:03:05.717 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd' already exists 2026-03-31T19:03:05.724 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:534: create_pool: sleep 1 2026-03-31T19:03:06.725 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:529: create_rbd_pool: rbd pool init rbd 2026-03-31T19:03:07.013 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:175: TEST_mon_classes: get_osds_up rbd SOMETHING 2026-03-31T19:03:07.013 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd 2026-03-31T19:03:07.013 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING 2026-03-31T19:03:07.013 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING 2026-03-31T19:03:07.013 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . 
-o ' ' 2026-03-31T19:03:07.131 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=1 2 0 ' 2026-03-31T19:03:07.131 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 1 2 0 2026-03-31T19:03:07.131 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:175: TEST_mon_classes: test '1 2 0' == '1 2 0' 2026-03-31T19:03:07.131 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:176: TEST_mon_classes: add_something td/crush-classes SOMETHING 2026-03-31T19:03:07.131 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:42: add_something: local dir=td/crush-classes 2026-03-31T19:03:07.131 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:43: add_something: local obj=SOMETHING 2026-03-31T19:03:07.131 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:45: add_something: local payload=ABCDEF 2026-03-31T19:03:07.131 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:46: add_something: echo ABCDEF 2026-03-31T19:03:07.131 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:47: add_something: rados --pool rbd put SOMETHING td/crush-classes/ORIGINAL 2026-03-31T19:03:07.156 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:179: TEST_mon_classes: ceph osd crush class create CLASS 2026-03-31T19:03:07.333 INFO:tasks.workunit.client.0.vm05.stderr:class 'CLASS' already exists 2026-03-31T19:03:07.341 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:180: TEST_mon_classes: ceph osd crush class create CLASS 2026-03-31T19:03:07.444 INFO:tasks.workunit.client.0.vm05.stderr:class 'CLASS' already exists 2026-03-31T19:03:07.451 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:181: TEST_mon_classes: ceph osd crush class ls 2026-03-31T19:03:07.451 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:181: TEST_mon_classes: grep CLASS 2026-03-31T19:03:07.557 INFO:tasks.workunit.client.0.vm05.stdout: "CLASS" 2026-03-31T19:03:07.557 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:182: TEST_mon_classes: ceph osd crush class rename CLASS TEMP 2026-03-31T19:03:07.738 INFO:tasks.workunit.client.0.vm05.stderr:already renamed to 'TEMP' 2026-03-31T19:03:07.745 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:183: TEST_mon_classes: ceph osd crush class ls 2026-03-31T19:03:07.745 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:183: TEST_mon_classes: grep TEMP 2026-03-31T19:03:07.852 INFO:tasks.workunit.client.0.vm05.stdout: "TEMP" 2026-03-31T19:03:07.852 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:184: TEST_mon_classes: ceph osd crush class rename TEMP CLASS 2026-03-31T19:03:08.043 INFO:tasks.workunit.client.0.vm05.stderr:already renamed to 'CLASS' 2026-03-31T19:03:08.050 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:185: TEST_mon_classes: ceph osd crush class ls 2026-03-31T19:03:08.051 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:185: TEST_mon_classes: grep CLASS 2026-03-31T19:03:08.157 INFO:tasks.workunit.client.0.vm05.stdout: "CLASS" 2026-03-31T19:03:08.157 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:186: TEST_mon_classes: ceph osd erasure-code-profile set myprofile plugin=jerasure technique=reed_sol_van k=2 m=1 crush-failure-domain=osd crush-device-class=CLASS 2026-03-31T19:03:08.354 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:187: TEST_mon_classes: expect_failure td/crush-classes EBUSY ceph osd crush class rm CLASS 2026-03-31T19:03:08.354 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2021: expect_failure: local dir=td/crush-classes 2026-03-31T19:03:08.354 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2022: expect_failure: shift 2026-03-31T19:03:08.354 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2023: expect_failure: local expected=EBUSY 2026-03-31T19:03:08.354 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2024: expect_failure: shift 2026-03-31T19:03:08.354 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2025: expect_failure: local success 2026-03-31T19:03:08.354 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2027: expect_failure: ceph osd crush class rm CLASS 2026-03-31T19:03:08.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2030: expect_failure: success=false 2026-03-31T19:03:08.434 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2033: expect_failure: false 2026-03-31T19:03:08.434 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2033: expect_failure: grep --quiet EBUSY td/crush-classes/out 2026-03-31T19:03:08.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2037: expect_failure: return 0 2026-03-31T19:03:08.435 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:188: TEST_mon_classes: ceph osd erasure-code-profile rm myprofile 2026-03-31T19:03:08.550 INFO:tasks.workunit.client.0.vm05.stderr:erasure-code-profile myprofile does not exist 2026-03-31T19:03:08.557 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:189: TEST_mon_classes: ceph osd crush class rm CLASS 2026-03-31T19:03:08.761 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:190: TEST_mon_classes: ceph osd crush class rm CLASS 2026-03-31T19:03:08.867 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:193: TEST_mon_classes: ceph osd crush set-device-class aaa osd.0 2026-03-31T19:03:09.140 INFO:tasks.workunit.client.0.vm05.stderr:osd.0 already set to class aaa. set-device-class item id 0 name 'osd.0' device_class 'aaa': no change. 
set osd(s) to class 'aaa' 2026-03-31T19:03:09.148 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:194: TEST_mon_classes: ceph osd tree 2026-03-31T19:03:09.148 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:194: TEST_mon_classes: grep -q aaa 2026-03-31T19:03:09.253 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:195: TEST_mon_classes: ceph osd crush dump 2026-03-31T19:03:09.253 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:195: TEST_mon_classes: grep -q '~aaa' 2026-03-31T19:03:09.356 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:196: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-31T19:03:09.356 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:196: TEST_mon_classes: grep -q '~aaa' 2026-03-31T19:03:09.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:197: TEST_mon_classes: ceph osd crush set-device-class bbb osd.1 2026-03-31T19:03:09.652 INFO:tasks.workunit.client.0.vm05.stderr:osd.1 already set to class bbb. set-device-class item id 1 name 'osd.1' device_class 'bbb': no change. 
set osd(s) to class 'bbb' 2026-03-31T19:03:09.660 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:198: TEST_mon_classes: ceph osd tree 2026-03-31T19:03:09.660 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:198: TEST_mon_classes: grep -q bbb 2026-03-31T19:03:09.765 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:199: TEST_mon_classes: grep -q '~bbb' 2026-03-31T19:03:09.765 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:199: TEST_mon_classes: ceph osd crush dump 2026-03-31T19:03:09.866 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:200: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-31T19:03:09.867 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:200: TEST_mon_classes: grep -q '~bbb' 2026-03-31T19:03:09.973 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:201: TEST_mon_classes: ceph osd crush set-device-class ccc osd.2 2026-03-31T19:03:10.158 INFO:tasks.workunit.client.0.vm05.stderr:osd.2 already set to class ccc. set-device-class item id 2 name 'osd.2' device_class 'ccc': no change. 
set osd(s) to class 'ccc' 2026-03-31T19:03:10.166 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:202: TEST_mon_classes: ceph osd tree 2026-03-31T19:03:10.166 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:202: TEST_mon_classes: grep -q ccc 2026-03-31T19:03:10.271 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:203: TEST_mon_classes: ceph osd crush dump 2026-03-31T19:03:10.271 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:203: TEST_mon_classes: grep -q '~ccc' 2026-03-31T19:03:10.376 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:204: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-31T19:03:10.376 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:204: TEST_mon_classes: grep -q '~ccc' 2026-03-31T19:03:10.481 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:205: TEST_mon_classes: ceph osd crush rm-device-class 0 2026-03-31T19:03:10.699 INFO:tasks.workunit.client.0.vm05.stderr:osd.0 belongs to no class, done removing class of osd(s): 2026-03-31T19:03:10.708 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:206: TEST_mon_classes: ceph osd tree 2026-03-31T19:03:10.708 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:206: TEST_mon_classes: grep -q aaa 2026-03-31T19:03:10.807 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:207: TEST_mon_classes: ceph osd crush class ls 2026-03-31T19:03:10.807 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:207: TEST_mon_classes: grep -q aaa 2026-03-31T19:03:10.912 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:208: TEST_mon_classes: ceph osd crush rm-device-class 1 2026-03-31T19:03:11.108 INFO:tasks.workunit.client.0.vm05.stderr:osd.1 belongs to no class, done removing class of osd(s): 2026-03-31T19:03:11.117 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:209: TEST_mon_classes: ceph osd tree 2026-03-31T19:03:11.117 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:209: TEST_mon_classes: grep -q bbb 2026-03-31T19:03:11.217 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:210: TEST_mon_classes: ceph osd crush class ls 2026-03-31T19:03:11.217 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:210: TEST_mon_classes: grep -q bbb 2026-03-31T19:03:11.322 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:211: TEST_mon_classes: ceph osd crush rm-device-class 2 2026-03-31T19:03:11.514 INFO:tasks.workunit.client.0.vm05.stderr:osd.2 belongs to no class, done removing class of osd(s): 2026-03-31T19:03:11.523 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:212: TEST_mon_classes: ceph osd tree 2026-03-31T19:03:11.523 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:212: TEST_mon_classes: grep -q ccc 2026-03-31T19:03:11.626 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:213: 
TEST_mon_classes: ceph osd crush class ls 2026-03-31T19:03:11.626 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:213: TEST_mon_classes: grep -q ccc 2026-03-31T19:03:11.731 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:214: TEST_mon_classes: ceph osd crush set-device-class asdf all 2026-03-31T19:03:11.921 INFO:tasks.workunit.client.0.vm05.stderr:osd.0 already set to class asdf. osd.1 already set to class asdf. osd.2 already set to class asdf. set osd(s) to class 'asdf' 2026-03-31T19:03:11.930 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:215: TEST_mon_classes: ceph osd tree 2026-03-31T19:03:11.930 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:215: TEST_mon_classes: grep -q asdf 2026-03-31T19:03:12.031 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:216: TEST_mon_classes: ceph osd crush dump 2026-03-31T19:03:12.031 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:216: TEST_mon_classes: grep -q '~asdf' 2026-03-31T19:03:12.135 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:217: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-31T19:03:12.135 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:217: TEST_mon_classes: grep -q '~asdf' 2026-03-31T19:03:12.243 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:218: TEST_mon_classes: ceph osd crush rule create-replicated asdf-rule default host asdf 2026-03-31T19:03:12.348 INFO:tasks.workunit.client.0.vm05.stderr:rule 
asdf-rule already exists 2026-03-31T19:03:12.355 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:219: TEST_mon_classes: ceph osd crush rm-device-class all 2026-03-31T19:03:12.634 INFO:tasks.workunit.client.0.vm05.stderr:osd.0 belongs to no class, osd.1 belongs to no class, osd.2 belongs to no class, done removing class of osd(s): 2026-03-31T19:03:12.642 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:220: TEST_mon_classes: ceph osd tree 2026-03-31T19:03:12.642 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:220: TEST_mon_classes: grep -q asdf 2026-03-31T19:03:12.743 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:221: TEST_mon_classes: ceph osd crush class ls 2026-03-31T19:03:12.743 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:221: TEST_mon_classes: grep -q asdf 2026-03-31T19:03:12.849 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:223: TEST_mon_classes: ceph osd crush set-device-class abc osd.2 2026-03-31T19:03:13.042 INFO:tasks.workunit.client.0.vm05.stderr:osd.2 already set to class abc. set-device-class item id 2 name 'osd.2' device_class 'abc': no change. 
set osd(s) to class 'abc' 2026-03-31T19:03:13.051 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:224: TEST_mon_classes: ceph osd crush move osd.2 root=foo rack=foo-rack host=foo-host 2026-03-31T19:03:13.168 INFO:tasks.workunit.client.0.vm05.stderr:no need to move item id 2 name 'osd.2' to location {host=foo-host,rack=foo-rack,root=foo} in crush map 2026-03-31T19:03:13.175 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:225: TEST_mon_classes: ceph osd tree 2026-03-31T19:03:13.175 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:225: TEST_mon_classes: awk '$1 == 2 && $2 == "abc" {print $0}' 2026-03-31T19:03:13.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:225: TEST_mon_classes: out=' 2 abc 0.09769 osd.2 up 1.00000 1.00000' 2026-03-31T19:03:13.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:226: TEST_mon_classes: '[' ' 2 abc 0.09769 osd.2 up 1.00000 1.00000' == '' ']' 2026-03-31T19:03:13.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:231: TEST_mon_classes: ceph osd crush dump 2026-03-31T19:03:13.283 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:231: TEST_mon_classes: grep -q foo~abc 2026-03-31T19:03:13.393 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:232: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-31T19:03:13.393 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:232: TEST_mon_classes: grep -q foo~abc 2026-03-31T19:03:13.505 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:233: TEST_mon_classes: ceph osd crush dump 2026-03-31T19:03:13.505 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:233: TEST_mon_classes: grep -q foo-rack~abc 2026-03-31T19:03:13.610 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:234: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-31T19:03:13.610 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:234: TEST_mon_classes: grep -q foo-rack~abc 2026-03-31T19:03:13.721 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:235: TEST_mon_classes: ceph osd crush dump 2026-03-31T19:03:13.721 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:235: TEST_mon_classes: grep -q foo-host~abc 2026-03-31T19:03:13.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:236: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-31T19:03:13.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:236: TEST_mon_classes: grep -q foo-host~abc 2026-03-31T19:03:13.935 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:237: TEST_mon_classes: ceph osd crush rm-device-class osd.2 2026-03-31T19:03:14.166 INFO:tasks.workunit.client.0.vm05.stderr:osd.2 belongs to no class, done removing class of osd(s): 2026-03-31T19:03:14.174 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:239: TEST_mon_classes: ceph osd crush set-device-class abc osd.2 
2026-03-31T19:03:14.373 INFO:tasks.workunit.client.0.vm05.stderr:osd.2 already set to class abc. set-device-class item id 2 name 'osd.2' device_class 'abc': no change. set osd(s) to class 'abc' 2026-03-31T19:03:14.381 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:241: TEST_mon_classes: ceph osd crush rule create-replicated foo-rule foo host abc 2026-03-31T19:03:14.496 INFO:tasks.workunit.client.0.vm05.stderr:rule foo-rule already exists 2026-03-31T19:03:14.502 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:244: TEST_mon_classes: ceph osd crush set-device-class hdd osd.0 2026-03-31T19:03:14.785 INFO:tasks.workunit.client.0.vm05.stderr:osd.0 already set to class hdd. set-device-class item id 0 name 'osd.0' device_class 'hdd': no change. set osd(s) to class 'hdd' 2026-03-31T19:03:14.793 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:245: TEST_mon_classes: expect_failure td/crush-classes EBUSY ceph osd crush set-device-class nvme osd.0 2026-03-31T19:03:14.793 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2021: expect_failure: local dir=td/crush-classes 2026-03-31T19:03:14.793 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2022: expect_failure: shift 2026-03-31T19:03:14.793 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2023: expect_failure: local expected=EBUSY 2026-03-31T19:03:14.793 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2024: expect_failure: shift 2026-03-31T19:03:14.793 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2025: expect_failure: local success 
2026-03-31T19:03:14.793 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2027: expect_failure: ceph osd crush set-device-class nvme osd.0 2026-03-31T19:03:14.880 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2030: expect_failure: success=false 2026-03-31T19:03:14.880 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2033: expect_failure: false 2026-03-31T19:03:14.880 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2033: expect_failure: grep --quiet EBUSY td/crush-classes/out 2026-03-31T19:03:14.881 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2037: expect_failure: return 0 2026-03-31T19:03:14.881 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:248: TEST_mon_classes: ceph osd crush rm-device-class all 2026-03-31T19:03:15.093 INFO:tasks.workunit.client.0.vm05.stderr:osd.0 belongs to no class, osd.1 belongs to no class, osd.2 belongs to no class, done removing class of osd(s): 2026-03-31T19:03:15.101 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:249: TEST_mon_classes: ceph osd crush set-device-class class_1 all 2026-03-31T19:03:15.297 INFO:tasks.workunit.client.0.vm05.stderr:osd.0 already set to class class_1. osd.1 already set to class class_1. osd.2 already set to class class_1. 
set osd(s) to class 'class_1' 2026-03-31T19:03:15.305 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:250: TEST_mon_classes: ceph osd crush class ls 2026-03-31T19:03:15.305 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:250: TEST_mon_classes: grep class_1 2026-03-31T19:03:15.412 INFO:tasks.workunit.client.0.vm05.stdout: "class_1" 2026-03-31T19:03:15.412 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:251: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-31T19:03:15.412 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:251: TEST_mon_classes: grep class_1 2026-03-31T19:03:15.519 INFO:tasks.workunit.client.0.vm05.stdout:-20 class_1 0.19537 root default~class_1 2026-03-31T19:03:15.519 INFO:tasks.workunit.client.0.vm05.stdout:-19 class_1 0.19537 host vm05~class_1 2026-03-31T19:03:15.519 INFO:tasks.workunit.client.0.vm05.stdout: 0 class_1 0.09769 osd.0 2026-03-31T19:03:15.519 INFO:tasks.workunit.client.0.vm05.stdout: 1 class_1 0.09769 osd.1 2026-03-31T19:03:15.519 INFO:tasks.workunit.client.0.vm05.stdout:-18 class_1 0.09769 root foo~class_1 2026-03-31T19:03:15.519 INFO:tasks.workunit.client.0.vm05.stdout:-17 class_1 0.09769 rack foo-rack~class_1 2026-03-31T19:03:15.519 INFO:tasks.workunit.client.0.vm05.stdout:-16 class_1 0.09769 host foo-host~class_1 2026-03-31T19:03:15.519 INFO:tasks.workunit.client.0.vm05.stdout: 2 class_1 0.09769 osd.2 2026-03-31T19:03:15.519 INFO:tasks.workunit.client.0.vm05.stdout: 2 class_1 0.09769 osd.2 2026-03-31T19:03:15.519 INFO:tasks.workunit.client.0.vm05.stdout: 0 class_1 0.09769 osd.0 2026-03-31T19:03:15.519 INFO:tasks.workunit.client.0.vm05.stdout: 1 class_1 0.09769 osd.1 2026-03-31T19:03:15.519 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:252: TEST_mon_classes: ceph osd crush rule create-replicated class_1_rule default host class_1 2026-03-31T19:03:15.728 INFO:tasks.workunit.client.0.vm05.stderr:rule class_1_rule already exists 2026-03-31T19:03:15.735 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:253: TEST_mon_classes: ceph osd crush class rename class_1 class_2 2026-03-31T19:03:15.931 INFO:tasks.workunit.client.0.vm05.stderr:already renamed to 'class_2' 2026-03-31T19:03:15.938 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:254: TEST_mon_classes: ceph osd crush class rename class_1 class_2 2026-03-31T19:03:16.043 INFO:tasks.workunit.client.0.vm05.stderr:already renamed to 'class_2' 2026-03-31T19:03:16.050 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:255: TEST_mon_classes: ceph osd crush class ls 2026-03-31T19:03:16.050 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:255: TEST_mon_classes: grep class_1 2026-03-31T19:03:16.164 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:256: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-31T19:03:16.164 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:256: TEST_mon_classes: grep class_1 2026-03-31T19:03:16.289 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:257: TEST_mon_classes: ceph osd crush class ls 2026-03-31T19:03:16.289 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:257: TEST_mon_classes: grep 
class_2 2026-03-31T19:03:16.404 INFO:tasks.workunit.client.0.vm05.stdout: "class_2" 2026-03-31T19:03:16.405 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:258: TEST_mon_classes: ceph osd crush tree --show-shadow 2026-03-31T19:03:16.405 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:258: TEST_mon_classes: grep class_2 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stdout:-20 class_2 0.19537 root default~class_2 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stdout:-19 class_2 0.19537 host vm05~class_2 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stdout: 0 class_2 0.09769 osd.0 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stdout: 1 class_2 0.09769 osd.1 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stdout:-18 class_2 0.09769 root foo~class_2 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stdout:-17 class_2 0.09769 rack foo-rack~class_2 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stdout:-16 class_2 0.09769 host foo-host~class_2 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stdout: 2 class_2 0.09769 osd.2 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stdout: 2 class_2 0.09769 osd.2 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stdout: 0 class_2 0.09769 osd.0 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stdout: 1 class_2 0.09769 osd.1 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:37: run: teardown td/crush-classes 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-classes 2026-03-31T19:03:16.519 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs= 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-classes KILL 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:03:16.519 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:03:16.629 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:03:16.629 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:03:16.629 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:03:16.630 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 
2026-03-31T19:03:16.631 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:03:16.631 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:03:16.631 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:03:16.632 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:16.632 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:03:16.632 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:03:16.632 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:16.633 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:03:16.634 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']' 2026-03-31T19:03:16.634 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-classes 2026-03-31T19:03:16.644 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:03:16.644 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:16.644 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:16.644 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.65997 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:34: run: for func in $funcs 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:35: run: setup td/crush-classes 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:122: setup: local dir=td/crush-classes 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:123: setup: teardown td/crush-classes 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-classes 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs= 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons 
td/crush-classes KILL 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:03:16.645 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:03:16.647 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:03:16.647 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:03:16.647 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:03:16.648 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 
2026-03-31T19:03:16.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:03:16.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:03:16.649 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:03:16.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:16.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:03:16.650 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:03:16.650 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:16.650 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:03:16.651 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']' 2026-03-31T19:03:16.651 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-classes 2026-03-31T19:03:16.652 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:03:16.652 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:16.652 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:16.652 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.65997 2026-03-31T19:03:16.653 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:03:16.653 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:03:16.653 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:124: setup: mkdir -p td/crush-classes 2026-03-31T19:03:16.654 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: get_asok_dir 2026-03-31T19:03:16.654 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:16.654 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:16.654 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: mkdir -p /tmp/ceph-asok.65997 2026-03-31T19:03:16.655 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: ulimit -n 2026-03-31T19:03:16.655 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: '[' 4096 -le 1024 ']' 
2026-03-31T19:03:16.655 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:129: setup: '[' -z '' ']' 2026-03-31T19:03:16.655 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:130: setup: trap 'teardown td/crush-classes 1' TERM HUP INT 2026-03-31T19:03:16.655 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:36: run: TEST_reweight_vs_classes td/crush-classes 2026-03-31T19:03:16.655 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:61: TEST_reweight_vs_classes: local dir=td/crush-classes 2026-03-31T19:03:16.655 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:66: TEST_reweight_vs_classes: run_mon td/crush-classes a 2026-03-31T19:03:16.655 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:439: run_mon: local dir=td/crush-classes 2026-03-31T19:03:16.656 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:440: run_mon: shift 2026-03-31T19:03:16.656 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:441: run_mon: local id=a 2026-03-31T19:03:16.656 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:442: run_mon: shift 2026-03-31T19:03:16.656 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:443: run_mon: local data=td/crush-classes/a 2026-03-31T19:03:16.656 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:446: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-classes/a --run-dir=td/crush-classes 2026-03-31T19:03:16.682 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: get_asok_path 2026-03-31T19:03:16.682 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:03:16.682 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:03:16.682 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:03:16.682 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:16.682 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:16.682 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:16.682 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-classes/a '--log-file=td/crush-classes/$name.log' '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --mon-cluster-log-file=td/crush-classes/log --run-dir=td/crush-classes '--pid-file=td/crush-classes/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 
2026-03-31T19:03:16.711 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: cat 2026-03-31T19:03:16.711 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a fsid 2026-03-31T19:03:16.711 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon 2026-03-31T19:03:16.711 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a 2026-03-31T19:03:16.711 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=fsid 2026-03-31T19:03:16.712 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a 2026-03-31T19:03:16.712 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a 2026-03-31T19:03:16.712 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']' 2026-03-31T19:03:16.712 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir 2026-03-31T19:03:16.712 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:16.712 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:16.712 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: 
get_asok_path: echo /tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:03:16.713 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:03:16.713 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS= 2026-03-31T19:03:16.713 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.65997/ceph-mon.a.asok config get fsid 2026-03-31T19:03:16.713 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .fsid 2026-03-31T19:03:16.761 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a mon_host 2026-03-31T19:03:16.761 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=mon_host 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n 
mon.a ']' 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS= 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.65997/ceph-mon.a.asok config get mon_host 2026-03-31T19:03:16.762 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .mon_host 2026-03-31T19:03:16.809 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:67: TEST_reweight_vs_classes: run_osd td/crush-classes 0 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-classes 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: 
run_osd: shift 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=0 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/0 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-classes/0' 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-classes/0/journal' 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:03:16.810 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:16.810 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:03:16.810 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:03:16.811 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:03:16.811 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-31T19:03:16.811 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-31T19:03:16.811 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:03:16.811 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:03:16.811 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:03:16.811 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:03:16.811 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:03:16.811 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+= 2026-03-31T19:03:16.811 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir 
-p td/crush-classes/0 2026-03-31T19:03:16.812 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:03:16.813 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=fd1de8c0-24ef-421d-a58f-442dbe3b843c 2026-03-31T19:03:16.813 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd0 fd1de8c0-24ef-421d-a58f-442dbe3b843c' 2026-03-31T19:03:16.813 INFO:tasks.workunit.client.0.vm05.stdout:add osd0 fd1de8c0-24ef-421d-a58f-442dbe3b843c 2026-03-31T19:03:16.813 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:03:16.826 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQB0Gsxp/Rk2MRAAHOLU4sHbi7SHCkXsToQ3qg== 2026-03-31T19:03:16.826 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQB0Gsxp/Rk2MRAAHOLU4sHbi7SHCkXsToQ3qg=="}' 2026-03-31T19:03:16.826 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new fd1de8c0-24ef-421d-a58f-442dbe3b843c -i td/crush-classes/0/new.json 2026-03-31T19:03:16.940 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:16.948 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-classes/0/new.json 2026-03-31T19:03:16.949 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 0 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 
--osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB0Gsxp/Rk2MRAAHOLU4sHbi7SHCkXsToQ3qg== --osd-uuid fd1de8c0-24ef-421d-a58f-442dbe3b843c 2026-03-31T19:03:16.969 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:16.967+0000 7f39706af900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:16.970 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:16.969+0000 7f39706af900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:16.971 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:16.970+0000 7f39706af900 -1 WARNING: all dangerous and experimental features are enabled. 
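The per-OSD bootstrap just traced follows a fixed sequence: generate a UUID, generate a cephx secret, register the OSD with `ceph osd new` using a one-line JSON file, then run `ceph-osd --mkfs`. The sketch below shows that sequence; the cluster-touching commands are left as comments because they need a live monitor, and `make_new_json` is a hypothetical helper name (the real script inlines the `echo`):

```shell
# Sketch of the OSD bootstrap sequence traced above. Only the JSON
# assembly is executed here; the ceph/ceph-authtool/ceph-osd calls are
# shown as comments since they require a running cluster.
make_new_json() {
    local secret=$1
    printf '{"cephx_secret": "%s"}\n' "$secret"
}

# uuid=$(uuidgen)
# OSD_SECRET=$(ceph-authtool --gen-print-key)
# make_new_json "$OSD_SECRET" > "$dir/$id/new.json"
# ceph osd new "$uuid" -i "$dir/$id/new.json"
# rm "$dir/$id/new.json"
# ceph-osd -i "$id" ... --mkfs --key "$OSD_SECRET" --osd-uuid "$uuid"
```

Note the `ceph osd new` call prints the assigned OSD id (`0` in the log above), which is why a bare `0` appears on stdout mid-trace.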
2026-03-31T19:03:16.971 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:16.970+0000 7f39706af900 -1 bdev(0x55bcfbc68c00 td/crush-classes/0/block) open stat got: (1) Operation not permitted 2026-03-31T19:03:16.972 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:16.971+0000 7f39706af900 -1 bluestore(td/crush-classes/0) _read_fsid unparsable uuid 2026-03-31T19:03:17.392 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/0/keyring 2026-03-31T19:03:17.392 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:03:17.393 INFO:tasks.workunit.client.0.vm05.stdout:adding osd0 key to auth repository 2026-03-31T19:03:17.393 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd0 key to auth repository 2026-03-31T19:03:17.393 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:03:17.501 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.0 2026-03-31T19:03:17.501 INFO:tasks.workunit.client.0.vm05.stdout:start osd.0 2026-03-31T19:03:17.501 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 0 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes 
'--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:03:17.501 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:03:17.501 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:03:17.501 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:03:17.519 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:17.517+0000 7f3269fc1900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:17.520 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:17.519+0000 7f3269fc1900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:17.522 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:17.520+0000 7f3269fc1900 -1 WARNING: all dangerous and experimental features are enabled. 
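The `wait_for_osd` loop that follows in the trace is a bounded poll: up to 300 one-second tries of a predicate. A generic form of that loop, simplified from what the xtrace shows (the real predicate greps `"osd.$id up"` out of `ceph osd dump`), looks like:

```shell
# Generic form of the wait_for_osd polling loop traced below: retry a
# predicate up to $tries times, sleeping one second between attempts,
# and echo the iteration counter as the real helper does.
wait_for() {
    local tries=$1
    shift
    local i
    for ((i = 0; i < tries; i++)); do
        echo "$i"          # the bare 0/1/... lines seen on stdout in the log
        if "$@"; then
            return 0
        fi
        sleep 1
    done
    return 1
}
```

Usage in the helpers' style would be `wait_for 300 some_check args...`; the stray `0` and `1` lines on stdout in the log are this counter being echoed each iteration.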
2026-03-31T19:03:17.607 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 0 2026-03-31T19:03:17.607 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:03:17.607 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=0 2026-03-31T19:03:17.607 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:03:17.607 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:03:17.607 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:17.607 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:03:17.608 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:17.608 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:17.608 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:03:17.657 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:17.655+0000 7f3269fc1900 -1 Falling back to public interface 2026-03-31T19:03:17.711 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:03:17.808 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:17.807+0000 7f3269fc1900 -1 osd.0 0 log_to_monitors true 2026-03-31T19:03:18.713 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:18.713 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:03:18.713 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:18.713 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:03:18.713 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:18.713 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:03:18.818 INFO:tasks.workunit.client.0.vm05.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/115562036,v1:127.0.0.1:6801/115562036] [v2:127.0.0.1:6802/115562036,v1:127.0.0.1:6803/115562036] exists,up fd1de8c0-24ef-421d-a58f-442dbe3b843c 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:68: TEST_reweight_vs_classes: run_osd td/crush-classes 1 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local 
dir=td/crush-classes 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=1 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/1 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-classes/1' 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-classes/1/journal' 2026-03-31T19:03:18.819 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-31T19:03:18.819 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:03:18.820 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:03:18.820 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:03:18.820 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:03:18.820 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:18.820 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:18.820 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:18.820 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:18.821 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:03:18.821 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:03:18.821 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:03:18.821 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-31T19:03:18.821 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-31T19:03:18.821 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:03:18.821 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:03:18.821 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:03:18.821 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:03:18.821 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:03:18.821 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: 
run_osd: ceph_args+= 2026-03-31T19:03:18.821 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-classes/1 2026-03-31T19:03:18.822 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:03:18.822 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=e93deaf4-b297-4404-abce-5bdf7998a808 2026-03-31T19:03:18.822 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd1 e93deaf4-b297-4404-abce-5bdf7998a808' 2026-03-31T19:03:18.823 INFO:tasks.workunit.client.0.vm05.stdout:add osd1 e93deaf4-b297-4404-abce-5bdf7998a808 2026-03-31T19:03:18.823 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:03:18.835 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQB2GsxpbczCMRAA0PfawS/9NMHJQPerVvSVFg== 2026-03-31T19:03:18.835 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQB2GsxpbczCMRAA0PfawS/9NMHJQPerVvSVFg=="}' 2026-03-31T19:03:18.836 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new e93deaf4-b297-4404-abce-5bdf7998a808 -i td/crush-classes/1/new.json 2026-03-31T19:03:18.960 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:18.967 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-classes/1/new.json 2026-03-31T19:03:18.968 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 1 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB2GsxpbczCMRAA0PfawS/9NMHJQPerVvSVFg== --osd-uuid e93deaf4-b297-4404-abce-5bdf7998a808 2026-03-31T19:03:18.987 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:18.986+0000 7fc44d352900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:18.989 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:18.987+0000 7fc44d352900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:18.990 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:18.988+0000 7fc44d352900 -1 WARNING: all dangerous and experimental features are enabled. 
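The long `ceph_args+=` runs traced above for osd.0 and osd.1 are plain string accumulation, repeated verbatim per OSD. Factored into one function it would look like the sketch below; `osd_args` is a hypothetical name and only a representative subset of the flags from the trace is included:

```shell
# The ceph_args accumulation repeated above for each OSD, factored into
# one function. Flag values are copied from the trace; the literal
# $name in --log-file/--pid-file is expanded later by the daemon.
osd_args() {
    local dir=$1 id=$2
    local args="--osd-data=$dir/$id"
    args+=" --osd-journal=$dir/$id/journal"
    args+=" --run-dir=$dir"
    args+=" --debug-osd=20 --debug-ms=1 --debug-monc=20"
    args+=" --log-file=$dir/\$name.log --pid-file=$dir/\$name.pid"
    echo "$args"
}
```

Keeping `$name` unexpanded (note the single quotes around those flags in the traced command lines) lets one argument string serve every daemon, each writing to its own log and pid file.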
2026-03-31T19:03:18.990 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:18.989+0000 7fc44d352900 -1 bdev(0x560a2aae8c00 td/crush-classes/1/block) open stat got: (1) Operation not permitted 2026-03-31T19:03:18.990 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:18.989+0000 7fc44d352900 -1 bluestore(td/crush-classes/1) _read_fsid unparsable uuid 2026-03-31T19:03:19.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/1/keyring 2026-03-31T19:03:19.462 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:03:19.463 INFO:tasks.workunit.client.0.vm05.stdout:adding osd1 key to auth repository 2026-03-31T19:03:19.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd1 key to auth repository 2026-03-31T19:03:19.463 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:03:19.576 INFO:tasks.workunit.client.0.vm05.stdout:start osd.1 2026-03-31T19:03:19.576 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.1 2026-03-31T19:03:19.576 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 1 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes 
'--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:03:19.576 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:03:19.576 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:03:19.577 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:03:19.594 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:19.593+0000 7f1a988ce900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:19.596 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:19.594+0000 7f1a988ce900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:19.597 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:19.595+0000 7f1a988ce900 -1 WARNING: all dangerous and experimental features are enabled. 
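Annotation: at ceph-helpers.sh:673 above, run_osd decides whether to wait for the OSD by piping `ceph osd dump --format=json` through `jq '.flags_set[]'` and `grep -q '"noup"'`. The same check can be sketched in Python against the parsed JSON; the sample dumps below are illustrative, not taken from this run:

```python
import json

def noup_is_set(osd_dump_json: str) -> bool:
    """Mirror of the jq '.flags_set[]' | grep -q '"noup"' check in run_osd."""
    dump = json.loads(osd_dump_json)
    return "noup" in dump.get("flags_set", [])

# Illustrative dumps; a real one comes from `ceph osd dump --format=json`.
assert noup_is_set('{"flags_set": ["noup", "sortbitwise"]}')
assert not noup_is_set('{"flags_set": ["sortbitwise"]}')
```

In this run the grep finds no `noup`, so run_osd falls through to wait_for_osd.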
2026-03-31T19:03:19.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 1 2026-03-31T19:03:19.680 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:03:19.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=1 2026-03-31T19:03:19.681 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:19.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:03:19.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:03:19.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:19.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:03:19.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:19.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up' 2026-03-31T19:03:19.739 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:19.738+0000 7f1a988ce900 -1 Falling back to public interface 2026-03-31T19:03:19.784 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:03:19.867 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:19.866+0000 7f1a988ce900 -1 osd.1 0 log_to_monitors true 2026-03-31T19:03:20.785 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:03:20.785 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:20.786 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:03:20.786 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:20.786 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:20.786 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up' 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stdout:osd.1 up in weight 1 up_from 8 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6808/1914853342,v1:127.0.0.1:6809/1914853342] [v2:127.0.0.1:6810/1914853342,v1:127.0.0.1:6811/1914853342] exists,up e93deaf4-b297-4404-abce-5bdf7998a808 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:69: TEST_reweight_vs_classes: run_osd td/crush-classes 2 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local 
dir=td/crush-classes 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=2 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/2 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-classes/2' 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-classes/2/journal' 2026-03-31T19:03:20.895 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:03:20.895 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:20.896 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:03:20.896 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:03:20.897 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:03:20.897 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: 
run_osd: ceph_args+= 2026-03-31T19:03:20.897 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-classes/2 2026-03-31T19:03:20.898 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:03:20.899 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=a49cc52d-0de3-4114-b467-459cb896d6c1 2026-03-31T19:03:20.899 INFO:tasks.workunit.client.0.vm05.stdout:add osd2 a49cc52d-0de3-4114-b467-459cb896d6c1 2026-03-31T19:03:20.899 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd2 a49cc52d-0de3-4114-b467-459cb896d6c1' 2026-03-31T19:03:20.899 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:03:20.912 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQB4GsxprAJUNhAA8teEFB8hXHFUIA5mqWuFJQ== 2026-03-31T19:03:20.912 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQB4GsxprAJUNhAA8teEFB8hXHFUIA5mqWuFJQ=="}' 2026-03-31T19:03:20.912 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new a49cc52d-0de3-4114-b467-459cb896d6c1 -i td/crush-classes/2/new.json 2026-03-31T19:03:21.042 INFO:tasks.workunit.client.0.vm05.stdout:2 2026-03-31T19:03:21.051 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-classes/2/new.json 2026-03-31T19:03:21.051 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 2 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB4GsxprAJUNhAA8teEFB8hXHFUIA5mqWuFJQ== --osd-uuid a49cc52d-0de3-4114-b467-459cb896d6c1 2026-03-31T19:03:21.071 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:21.070+0000 7f4b18d73900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:21.073 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:21.072+0000 7f4b18d73900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:21.075 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:21.073+0000 7f4b18d73900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:03:21.075 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:21.074+0000 7f4b18d73900 -1 bdev(0x56478c8a4c00 td/crush-classes/2/block) open stat got: (1) Operation not permitted 2026-03-31T19:03:21.075 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:21.074+0000 7f4b18d73900 -1 bluestore(td/crush-classes/2) _read_fsid unparsable uuid 2026-03-31T19:03:21.517 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/2/keyring 2026-03-31T19:03:21.517 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:03:21.518 INFO:tasks.workunit.client.0.vm05.stdout:adding osd2 key to auth repository 2026-03-31T19:03:21.518 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd2 key to auth repository 2026-03-31T19:03:21.518 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:03:21.640 INFO:tasks.workunit.client.0.vm05.stdout:start osd.2 2026-03-31T19:03:21.640 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.2 2026-03-31T19:03:21.640 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 2 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes 
'--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:03:21.640 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:03:21.640 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:03:21.640 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:03:21.667 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:21.666+0000 7f3aada5b900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:21.669 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:21.668+0000 7f3aada5b900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:21.670 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:21.669+0000 7f3aada5b900 -1 WARNING: all dangerous and experimental features are enabled. 
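Annotation: wait_for_osd (ceph-helpers.sh:972-985, traced above and below) is a plain poll-until-timeout loop: up to 300 one-second attempts at `ceph osd dump | grep 'osd.<id> up'`. The control flow can be sketched generically; the probe and sleep are injected here so the loop itself is testable without a cluster:

```python
import time
from typing import Callable

def wait_for(probe: Callable[[], bool], attempts: int = 300,
             sleep: Callable[[float], None] = time.sleep) -> bool:
    """Poll `probe` once per second, as wait_for_osd does with
    `ceph osd dump | grep "osd.<id> up"`; True as soon as it succeeds."""
    for _ in range(attempts):
        if probe():
            return True
        sleep(1)
    return False
```

In the log, the first probe for each OSD fails (status=1), the loop sleeps once, and the second iteration matches the `osd.N up in weight 1 ...` line, so wait_for_osd returns 0.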
2026-03-31T19:03:21.761 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 2 2026-03-31T19:03:21.761 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:03:21.761 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=2 2026-03-31T19:03:21.761 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:21.761 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:03:21.761 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:03:21.762 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:21.762 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:03:21.762 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:21.762 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.2 up' 2026-03-31T19:03:21.829 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:21.828+0000 7f3aada5b900 -1 Falling back to public interface 2026-03-31T19:03:21.875 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:03:21.987 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:21.986+0000 7f3aada5b900 -1 osd.2 0 log_to_monitors true 2026-03-31T19:03:22.876 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:22.877 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:03:22.877 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:22.877 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:03:22.877 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:22.877 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.2 up' 2026-03-31T19:03:22.991 INFO:tasks.workunit.client.0.vm05.stdout:osd.2 up in weight 1 up_from 12 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6816/2896947577,v1:127.0.0.1:6817/2896947577] [v2:127.0.0.1:6818/2896947577,v1:127.0.0.1:6819/2896947577] exists,up a49cc52d-0de3-4114-b467-459cb896d6c1 2026-03-31T19:03:22.992 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:03:22.992 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:03:22.992 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:03:22.992 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:71: TEST_reweight_vs_classes: ceph osd crush set-device-class ssd osd.0 2026-03-31T19:03:23.267 INFO:tasks.workunit.client.0.vm05.stderr:osd.0 already set to class ssd. 
set-device-class item id 0 name 'osd.0' device_class 'ssd': no change. set osd(s) to class 'ssd' 2026-03-31T19:03:23.275 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:72: TEST_reweight_vs_classes: ceph osd crush class ls-osd ssd 2026-03-31T19:03:23.275 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:72: TEST_reweight_vs_classes: grep 0 2026-03-31T19:03:23.385 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:23.385 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:73: TEST_reweight_vs_classes: ceph osd crush set-device-class ssd osd.1 2026-03-31T19:03:23.573 INFO:tasks.workunit.client.0.vm05.stderr:osd.1 already set to class ssd. set-device-class item id 1 name 'osd.1' device_class 'ssd': no change. set osd(s) to class 'ssd' 2026-03-31T19:03:23.582 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:74: TEST_reweight_vs_classes: ceph osd crush class ls-osd ssd 2026-03-31T19:03:23.582 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:74: TEST_reweight_vs_classes: grep 1 2026-03-31T19:03:23.692 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:23.692 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:76: TEST_reweight_vs_classes: ceph osd crush reweight osd.0 1 2026-03-31T19:03:23.880 INFO:tasks.workunit.client.0.vm05.stderr:reweighted item id 0 name 'osd.0' to 1 in crush map 2026-03-31T19:03:23.888 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:78: TEST_reweight_vs_classes: hostname -s 2026-03-31T19:03:23.888 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:78: TEST_reweight_vs_classes: h=vm05 2026-03-31T19:03:23.889 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:79: TEST_reweight_vs_classes: ceph osd crush dump 2026-03-31T19:03:23.889 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:79: TEST_reweight_vs_classes: jq '.buckets[] | select(.name=="vm05") | .items[0].weight' 2026-03-31T19:03:23.889 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:79: TEST_reweight_vs_classes: grep 65536 2026-03-31T19:03:23.994 INFO:tasks.workunit.client.0.vm05.stdout:65536 2026-03-31T19:03:23.995 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:80: TEST_reweight_vs_classes: ceph osd crush dump 2026-03-31T19:03:23.995 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:80: TEST_reweight_vs_classes: jq '.buckets[] | select(.name=="vm05~ssd") | .items[0].weight' 2026-03-31T19:03:23.995 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:80: TEST_reweight_vs_classes: grep 65536 2026-03-31T19:03:24.106 INFO:tasks.workunit.client.0.vm05.stdout:65536 2026-03-31T19:03:24.106 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:82: TEST_reweight_vs_classes: ceph osd crush set 0 2 host=vm05 2026-03-31T19:03:24.308 INFO:tasks.workunit.client.0.vm05.stderr:set item id 0 name 'osd.0' weight 2 at location {host=vm05}: no change 2026-03-31T19:03:24.315 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:84: TEST_reweight_vs_classes: 
ceph osd crush dump 2026-03-31T19:03:24.315 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:84: TEST_reweight_vs_classes: jq '.buckets[] | select(.name=="vm05") | .items[0].weight' 2026-03-31T19:03:24.315 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:84: TEST_reweight_vs_classes: grep 131072 2026-03-31T19:03:24.426 INFO:tasks.workunit.client.0.vm05.stdout:131072 2026-03-31T19:03:24.426 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:85: TEST_reweight_vs_classes: ceph osd crush dump 2026-03-31T19:03:24.427 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:85: TEST_reweight_vs_classes: jq '.buckets[] | select(.name=="vm05~ssd") | .items[0].weight' 2026-03-31T19:03:24.427 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:85: TEST_reweight_vs_classes: grep 131072 2026-03-31T19:03:24.537 INFO:tasks.workunit.client.0.vm05.stdout:131072 2026-03-31T19:03:24.537 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:37: run: teardown td/crush-classes 2026-03-31T19:03:24.537 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-classes 2026-03-31T19:03:24.537 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs= 2026-03-31T19:03:24.537 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-classes KILL 2026-03-31T19:03:24.537 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:03:24.538 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:03:24.538 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:03:24.538 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:03:24.538 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:03:24.646 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:03:24.646 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:03:24.647 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:03:24.647 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 
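Annotation: the weight values TEST_reweight_vs_classes greps for above are CRUSH's 16.16 fixed-point encoding of bucket item weights: `ceph osd crush reweight osd.0 1` shows up as 65536 in `ceph osd crush dump`, and `ceph osd crush set 0 2 host=vm05` as 131072, in both the plain `vm05` bucket and its `vm05~ssd` shadow bucket (which is the point of the test). The conversion is:

```python
def crush_weight_to_fixed(weight: float) -> int:
    """CRUSH stores bucket item weights as 16.16 fixed point (1.0 == 0x10000)."""
    return int(round(weight * 0x10000))

def crush_weight_from_fixed(raw: int) -> float:
    return raw / 0x10000

assert crush_weight_to_fixed(1.0) == 65536   # 'crush reweight osd.0 1' in the log
assert crush_weight_to_fixed(2.0) == 131072  # 'crush set 0 2 host=vm05' in the log
```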
2026-03-31T19:03:24.648 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:03:24.648 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:03:24.648 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:03:24.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:24.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:03:24.649 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:03:24.649 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:24.650 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:03:24.651 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']' 2026-03-31T19:03:24.651 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-classes 2026-03-31T19:03:24.661 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:03:24.661 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:24.661 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:24.661 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.65997 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:34: run: for func in $funcs 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:35: run: setup td/crush-classes 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:122: setup: local dir=td/crush-classes 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:123: setup: teardown td/crush-classes 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-classes 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs= 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons 
td/crush-classes KILL 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:03:24.662 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:03:24.664 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:03:24.664 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:03:24.664 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:03:24.665 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 
2026-03-31T19:03:24.665 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:03:24.666 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:03:24.666 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:03:24.666 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:24.666 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:03:24.667 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:03:24.667 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:24.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:03:24.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']' 2026-03-31T19:03:24.668 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-classes 2026-03-31T19:03:24.669 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:03:24.669 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:24.669 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:24.670 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.65997 2026-03-31T19:03:24.670 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:03:24.670 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:03:24.670 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:124: setup: mkdir -p td/crush-classes 2026-03-31T19:03:24.671 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: get_asok_dir 2026-03-31T19:03:24.671 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:24.671 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:24.672 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:125: setup: mkdir -p /tmp/ceph-asok.65997 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: ulimit -n 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:126: setup: '[' 4096 -le 1024 ']' 
2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:129: setup: '[' -z '' ']' 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:130: setup: trap 'teardown td/crush-classes 1' TERM HUP INT 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:36: run: TEST_set_device_class td/crush-classes 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:142: TEST_set_device_class: local dir=td/crush-classes 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:144: TEST_set_device_class: TEST_classes td/crush-classes 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:89: TEST_classes: local dir=td/crush-classes 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:91: TEST_classes: run_mon td/crush-classes a 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:439: run_mon: local dir=td/crush-classes 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:440: run_mon: shift 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:441: run_mon: local id=a 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:442: run_mon: shift 2026-03-31T19:03:24.673 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:443: run_mon: local data=td/crush-classes/a 2026-03-31T19:03:24.673 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:446: run_mon: ceph-mon --id a --mkfs --mon-data=td/crush-classes/a --run-dir=td/crush-classes 2026-03-31T19:03:24.695 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: get_asok_path 2026-03-31T19:03:24.695 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:03:24.696 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:03:24.696 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:03:24.696 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:24.696 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:24.696 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:24.696 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:453: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/crush-classes/a 
'--log-file=td/crush-classes/$name.log' '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --mon-cluster-log-file=td/crush-classes/log --run-dir=td/crush-classes '--pid-file=td/crush-classes/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-31T19:03:24.726 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: cat 2026-03-31T19:03:24.727 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a fsid 2026-03-31T19:03:24.727 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon 2026-03-31T19:03:24.727 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a 2026-03-31T19:03:24.727 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=fsid 2026-03-31T19:03:24.727 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a 2026-03-31T19:03:24.727 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a 2026-03-31T19:03:24.727 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']' 2026-03-31T19:03:24.728 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir 2026-03-31T19:03:24.728 
INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:24.728 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:24.728 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:03:24.729 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:03:24.729 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS= 2026-03-31T19:03:24.729 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.65997/ceph-mon.a.asok config get fsid 2026-03-31T19:03:24.730 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .fsid 2026-03-31T19:03:24.778 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:478: run_mon: get_config mon a mon_host 2026-03-31T19:03:24.778 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1119: get_config: local daemon=mon 2026-03-31T19:03:24.778 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1120: get_config: local id=a 2026-03-31T19:03:24.778 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1121: get_config: local config=mon_host 2026-03-31T19:03:24.778 
INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: get_asok_path mon.a 2026-03-31T19:03:24.778 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name=mon.a 2026-03-31T19:03:24.778 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n mon.a ']' 2026-03-31T19:03:24.778 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: get_asok_dir 2026-03-31T19:03:24.778 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:24.778 INFO:tasks.workunit.client.0.vm05.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:24.779 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:109: get_asok_path: echo /tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:03:24.779 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1123: get_config: local daemon_asok=/tmp/ceph-asok.65997/ceph-mon.a.asok 2026-03-31T19:03:24.779 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: CEPH_ARGS= 2026-03-31T19:03:24.779 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1124: get_config: ceph --format json daemon /tmp/ceph-asok.65997/ceph-mon.a.asok config get mon_host 2026-03-31T19:03:24.779 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: jq -r .mon_host 2026-03-31T19:03:24.829 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:92: TEST_classes: run_osd td/crush-classes 0 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local dir=td/crush-classes 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=0 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/0 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' 
--osd-data=td/crush-classes/0' 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-classes/0/journal' 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:24.829 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:24.830 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:03:24.830 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+= 2026-03-31T19:03:24.830 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-classes/0 2026-03-31T19:03:24.831 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:03:24.832 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=6a5f16fb-cdc4-4558-9cce-59c387bef263 2026-03-31T19:03:24.832 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd0 6a5f16fb-cdc4-4558-9cce-59c387bef263' 2026-03-31T19:03:24.832 INFO:tasks.workunit.client.0.vm05.stdout:add osd0 6a5f16fb-cdc4-4558-9cce-59c387bef263 2026-03-31T19:03:24.832 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:03:24.844 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQB8GsxpVqtNMhAAk5p2ZmpoJE8MPmJUi0ADHg== 2026-03-31T19:03:24.844 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQB8GsxpVqtNMhAAk5p2ZmpoJE8MPmJUi0ADHg=="}' 2026-03-31T19:03:24.844 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new 6a5f16fb-cdc4-4558-9cce-59c387bef263 -i td/crush-classes/0/new.json 2026-03-31T19:03:24.954 
INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:24.961 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-classes/0/new.json 2026-03-31T19:03:24.962 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 0 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB8GsxpVqtNMhAAk5p2ZmpoJE8MPmJUi0ADHg== --osd-uuid 6a5f16fb-cdc4-4558-9cce-59c387bef263 2026-03-31T19:03:24.980 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:24.979+0000 7f47878a9900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:24.982 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:24.981+0000 7f47878a9900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:24.983 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:24.982+0000 7f47878a9900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:03:24.984 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:24.983+0000 7f47878a9900 -1 bdev(0x558d5124cc00 td/crush-classes/0/block) open stat got: (1) Operation not permitted 2026-03-31T19:03:24.984 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:24.983+0000 7f47878a9900 -1 bluestore(td/crush-classes/0) _read_fsid unparsable uuid 2026-03-31T19:03:25.421 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/0/keyring 2026-03-31T19:03:25.421 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:03:25.422 INFO:tasks.workunit.client.0.vm05.stdout:adding osd0 key to auth repository 2026-03-31T19:03:25.422 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd0 key to auth repository 2026-03-31T19:03:25.422 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:03:25.535 INFO:tasks.workunit.client.0.vm05.stdout:start osd.0 2026-03-31T19:03:25.535 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.0 2026-03-31T19:03:25.535 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 0 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/0 --osd-journal=td/crush-classes/0/journal --chdir= --run-dir=td/crush-classes 
'--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:03:25.535 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:03:25.535 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:03:25.535 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:03:25.555 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:25.553+0000 7fe8cee69900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:25.557 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:25.556+0000 7fe8cee69900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:25.558 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:25.557+0000 7fe8cee69900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:03:25.643 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 0 2026-03-31T19:03:25.644 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:03:25.644 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=0 2026-03-31T19:03:25.644 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:03:25.644 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:03:25.644 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:25.644 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:03:25.644 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:25.644 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:25.644 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:03:25.707 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:25.706+0000 7fe8cee69900 -1 Falling back to public interface 2026-03-31T19:03:25.750 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:03:25.832 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:25.831+0000 7fe8cee69900 -1 osd.0 0 log_to_monitors true 2026-03-31T19:03:26.752 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:26.752 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:03:26.752 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:26.752 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:03:26.752 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:26.752 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.0 up' 2026-03-31T19:03:26.857 INFO:tasks.workunit.client.0.vm05.stdout:osd.0 up in weight 1 up_from 4 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6800/2762164329,v1:127.0.0.1:6801/2762164329] [v2:127.0.0.1:6802/2762164329,v1:127.0.0.1:6803/2762164329] exists,up 6a5f16fb-cdc4-4558-9cce-59c387bef263 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:93: TEST_classes: run_osd td/crush-classes 1 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local 
dir=td/crush-classes 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=1 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/1 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-classes/1' 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-classes/1/journal' 2026-03-31T19:03:26.858 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:03:26.858 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-31T19:03:26.859 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:03:26.859 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:03:26.859 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:03:26.859 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:03:26.859 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:26.859 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:26.859 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:26.859 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:26.859 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:03:26.859 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:03:26.859 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:03:26.859 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-31T19:03:26.860 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-31T19:03:26.860 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:03:26.860 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:03:26.860 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:03:26.860 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:03:26.860 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:03:26.860 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: 
run_osd: ceph_args+= 2026-03-31T19:03:26.860 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-classes/1 2026-03-31T19:03:26.860 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:03:26.861 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=33a97f78-bd30-40c4-ae53-c80907797f08 2026-03-31T19:03:26.861 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd1 33a97f78-bd30-40c4-ae53-c80907797f08' 2026-03-31T19:03:26.861 INFO:tasks.workunit.client.0.vm05.stdout:add osd1 33a97f78-bd30-40c4-ae53-c80907797f08 2026-03-31T19:03:26.862 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:03:26.873 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQB+Gsxp9BIHNBAAfg19bK+W1C1gjyXZGQbbxQ== 2026-03-31T19:03:26.873 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQB+Gsxp9BIHNBAAfg19bK+W1C1gjyXZGQbbxQ=="}' 2026-03-31T19:03:26.873 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new 33a97f78-bd30-40c4-ae53-c80907797f08 -i td/crush-classes/1/new.json 2026-03-31T19:03:26.986 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:26.992 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-classes/1/new.json 2026-03-31T19:03:26.993 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 1 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB+Gsxp9BIHNBAAfg19bK+W1C1gjyXZGQbbxQ== --osd-uuid 33a97f78-bd30-40c4-ae53-c80907797f08 2026-03-31T19:03:27.011 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:27.010+0000 7f6a5b771900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:27.012 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:27.011+0000 7f6a5b771900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:27.013 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:27.012+0000 7f6a5b771900 -1 WARNING: all dangerous and experimental features are enabled. 
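The xtrace above shows `wait_for_osd` polling `ceph osd dump | grep "osd.$id up"` up to 300 times, sleeping one second between attempts and returning 0 on the first match. A minimal standalone sketch of that retry shape (the `wait_for` name and the `test` predicate here are illustrative, not the actual helper):

```shell
# Retry a predicate command up to max_tries times, sleeping 1s between
# attempts -- the same loop shape as wait_for_osd in ceph-helpers.sh.
wait_for() {
    local max_tries=$1
    shift
    local i=0
    while [ "$i" -lt "$max_tries" ]; do
        if "$@"; then
            return 0
        fi
        sleep 1
        i=$((i + 1))
    done
    return 1
}

# In the real helper the predicate is: ceph osd dump | grep "osd.$id up"
wait_for 3 test -d /tmp && echo "condition met"
```

With a 300-try limit and 1s sleeps, the traced helper effectively allows the OSD five minutes to report `up` before failing the test.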
2026-03-31T19:03:27.013 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:27.012+0000 7f6a5b771900 -1 bdev(0x55d70c312c00 td/crush-classes/1/block) open stat got: (1) Operation not permitted 2026-03-31T19:03:27.013 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:27.012+0000 7f6a5b771900 -1 bluestore(td/crush-classes/1) _read_fsid unparsable uuid 2026-03-31T19:03:27.467 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/1/keyring 2026-03-31T19:03:27.467 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:03:27.467 INFO:tasks.workunit.client.0.vm05.stdout:adding osd1 key to auth repository 2026-03-31T19:03:27.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd1 key to auth repository 2026-03-31T19:03:27.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:03:27.577 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.1 2026-03-31T19:03:27.577 INFO:tasks.workunit.client.0.vm05.stdout:start osd.1 2026-03-31T19:03:27.577 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 1 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/1 --osd-journal=td/crush-classes/1/journal --chdir= --run-dir=td/crush-classes 
'--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:03:27.577 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:03:27.577 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:03:27.577 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:03:27.594 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:27.593+0000 7fe358cc8900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:27.596 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:27.595+0000 7fe358cc8900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:27.598 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:27.596+0000 7fe358cc8900 -1 WARNING: all dangerous and experimental features are enabled. 
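As the trace shows, `run_osd` invokes `ceph-osd` twice with the same argument set: first with `--mkfs --key … --osd-uuid …` to initialize the data directory, then again without `--mkfs` to start the daemon. Arguments such as `--log-file=…/$name.log` and `--admin-socket=…/$cluster-$name.asok` appear single-quoted because `$name` and `$cluster` are Ceph config metavariables that must reach the daemon unexpanded. A sketch of that argument assembly (function and argument names here are illustrative; the real helper accumulates a flat `ceph_args` string):

```shell
build_osd_args() {
    # Assemble per-OSD arguments; $cluster and $name stay literal so the
    # ceph daemon expands them itself (they are config metavariables).
    local dir=$1 id=$2
    printf '%s\n' \
        --osd-data="$dir/$id" \
        --osd-journal="$dir/$id/journal" \
        --run-dir="$dir" \
        "--log-file=$dir"'/$name.log' \
        "--pid-file=$dir"'/$name.pid'
}

build_osd_args td/crush-classes 1
```

In the traced run the same argument list is reused verbatim for the daemon start, with only `--mkfs`, `--key`, and `--osd-uuid` dropped.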
2026-03-31T19:03:27.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 1 2026-03-31T19:03:27.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:03:27.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=1 2026-03-31T19:03:27.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:03:27.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:03:27.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:27.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:03:27.681 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:27.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:27.681 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up' 2026-03-31T19:03:27.733 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:27.731+0000 7fe358cc8900 -1 Falling back to public interface 2026-03-31T19:03:27.785 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:03:27.908 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:27.907+0000 7fe358cc8900 -1 osd.1 0 log_to_monitors true 2026-03-31T19:03:28.787 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:28.787 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:03:28.787 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:28.787 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:03:28.787 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:28.787 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.1 up' 2026-03-31T19:03:28.892 INFO:tasks.workunit.client.0.vm05.stdout:osd.1 up in weight 1 up_from 7 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6808/2094587354,v1:127.0.0.1:6809/2094587354] [v2:127.0.0.1:6810/2094587354,v1:127.0.0.1:6811/2094587354] exists,up 33a97f78-bd30-40c4-ae53-c80907797f08 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:94: TEST_classes: run_osd td/crush-classes 2 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:625: run_osd: local 
dir=td/crush-classes 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:626: run_osd: shift 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:627: run_osd: local id=2 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:628: run_osd: shift 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:629: run_osd: local osd_data=td/crush-classes/2 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:631: run_osd: local 'ceph_args=--fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false ' 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:632: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: ceph_args+=' --osd-data=td/crush-classes/2' 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: ceph_args+=' --osd-journal=td/crush-classes/2/journal' 2026-03-31T19:03:28.893 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: ceph_args+=' --chdir=' 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:638: run_osd: ceph_args+= 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: ceph_args+=' --run-dir=td/crush-classes' 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: get_asok_path 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:107: get_asok_path: local name= 2026-03-31T19:03:28.893 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_path: '[' -n '' ']' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: get_asok_dir 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_path: echo '/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' 2026-03-31T19:03:28.894 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --debug-osd=20' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --debug-ms=1' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --debug-monc=20' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --log-file=td/crush-classes/$name.log' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --pid-file=td/crush-classes/$name.pid' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' ' 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: 
run_osd: ceph_args+= 2026-03-31T19:03:28.894 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: mkdir -p td/crush-classes/2 2026-03-31T19:03:28.895 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: uuidgen 2026-03-31T19:03:28.896 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: local uuid=e2d86cb0-026e-4c74-98bf-1cc26aac9be5 2026-03-31T19:03:28.897 INFO:tasks.workunit.client.0.vm05.stdout:add osd2 e2d86cb0-026e-4c74-98bf-1cc26aac9be5 2026-03-31T19:03:28.897 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: echo 'add osd2 e2d86cb0-026e-4c74-98bf-1cc26aac9be5' 2026-03-31T19:03:28.897 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph-authtool --gen-print-key 2026-03-31T19:03:28.913 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: OSD_SECRET=AQCAGsxpImVaNhAAhPemSSuQxMU4Hy+nRceZtw== 2026-03-31T19:03:28.913 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: echo '{"cephx_secret": "AQCAGsxpImVaNhAAhPemSSuQxMU4Hy+nRceZtw=="}' 2026-03-31T19:03:28.913 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph osd new e2d86cb0-026e-4c74-98bf-1cc26aac9be5 -i td/crush-classes/2/new.json 2026-03-31T19:03:29.037 INFO:tasks.workunit.client.0.vm05.stdout:2 2026-03-31T19:03:29.045 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: rm td/crush-classes/2/new.json 2026-03-31T19:03:29.046 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: ceph-osd -i 2 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes '--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCAGsxpImVaNhAAhPemSSuQxMU4Hy+nRceZtw== --osd-uuid e2d86cb0-026e-4c74-98bf-1cc26aac9be5 2026-03-31T19:03:29.064 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:29.063+0000 7eff19ea6900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:29.065 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:29.064+0000 7eff19ea6900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:29.066 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:29.065+0000 7eff19ea6900 -1 WARNING: all dangerous and experimental features are enabled. 
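The nested `get_asok_path`/`get_asok_dir` calls traced earlier (the extra leading slashes are xtrace nesting depth from `PS4`) fall back to a per-process directory under `/tmp` and return a template path with `$cluster-$name` left literal. A sketch consistent with the traced behavior; the `CEPH_ASOK_DIR` override name is an assumption here, since the trace only exercises the fallback branch:

```shell
get_asok_dir() {
    # Fallback seen in the trace: /tmp/ceph-asok.<pid of the test run>.
    # CEPH_ASOK_DIR as the override variable is an assumption.
    if [ -n "${CEPH_ASOK_DIR:-}" ]; then
        echo "$CEPH_ASOK_DIR"
    else
        echo "/tmp/ceph-asok.$$"
    fi
}

get_asok_path() {
    # Leave $cluster/$name literal; the daemon substitutes them when it
    # resolves its admin-socket path.
    echo "$(get_asok_dir)"'/$cluster-$name.asok'
}

CEPH_ASOK_DIR=/tmp/ceph-asok.demo get_asok_path
```

Keeping the socket path templated means one `--admin-socket` argument serves every daemon the helpers launch, each expanding it to its own `$cluster-$name.asok`.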
2026-03-31T19:03:29.067 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:29.066+0000 7eff19ea6900 -1 bdev(0x564193dd8c00 td/crush-classes/2/block) open stat got: (1) Operation not permitted 2026-03-31T19:03:29.067 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:29.066+0000 7eff19ea6900 -1 bluestore(td/crush-classes/2) _read_fsid unparsable uuid 2026-03-31T19:03:29.560 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local key_fn=td/crush-classes/2/keyring 2026-03-31T19:03:29.560 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: cat 2026-03-31T19:03:29.561 INFO:tasks.workunit.client.0.vm05.stdout:adding osd2 key to auth repository 2026-03-31T19:03:29.561 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: echo adding osd2 key to auth repository 2026-03-31T19:03:29.561 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph -i td/crush-classes/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-31T19:03:29.683 INFO:tasks.workunit.client.0.vm05.stdout:start osd.2 2026-03-31T19:03:29.684 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:669: run_osd: echo start osd.2 2026-03-31T19:03:29.684 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: ceph-osd -i 2 --fsid=be34b6ca-f30f-4e31-a25d-6762c275d7e2 --auth-supported=none --mon-host=127.0.0.1:7130 --osd-class-update-on-start=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/crush-classes/2 --osd-journal=td/crush-classes/2/journal --chdir= --run-dir=td/crush-classes 
'--admin-socket=/tmp/ceph-asok.65997/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/crush-classes/$name.log' '--pid-file=td/crush-classes/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-31T19:03:29.684 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: ceph osd dump --format=json 2026-03-31T19:03:29.684 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: jq '.flags_set[]' 2026-03-31T19:03:29.684 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:673: run_osd: grep -q '"noup"' 2026-03-31T19:03:29.705 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:29.704+0000 7fcf419a0900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:29.707 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:29.706+0000 7fcf419a0900 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-31T19:03:29.708 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:29.707+0000 7fcf419a0900 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-31T19:03:29.793 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: wait_for_osd up 2 2026-03-31T19:03:29.793 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:972: wait_for_osd: local state=up 2026-03-31T19:03:29.793 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:29.793 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:973: wait_for_osd: local id=2 2026-03-31T19:03:29.793 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:975: wait_for_osd: status=1 2026-03-31T19:03:29.794 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i=0 )) 2026-03-31T19:03:29.794 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:29.794 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 0 2026-03-31T19:03:29.794 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:29.794 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.2 up' 2026-03-31T19:03:29.844 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:29.843+0000 7fcf419a0900 -1 Falling back to public interface 2026-03-31T19:03:29.903 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: sleep 1 2026-03-31T19:03:30.014 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:30.013+0000 7fcf419a0900 -1 osd.2 0 log_to_monitors true 2026-03-31T19:03:30.611 
INFO:tasks.workunit.client.0.vm05.stderr:2026-03-31T19:03:30.610+0000 7fcf3d940640 -1 osd.2 0 waiting for initial osdmap 2026-03-31T19:03:30.904 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:30.905 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i++ )) 2026-03-31T19:03:30.905 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:976: wait_for_osd: (( i < 300 )) 2026-03-31T19:03:30.905 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:977: wait_for_osd: echo 1 2026-03-31T19:03:30.905 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: ceph osd dump 2026-03-31T19:03:30.905 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: grep 'osd.2 up' 2026-03-31T19:03:31.014 INFO:tasks.workunit.client.0.vm05.stdout:osd.2 up in weight 1 up_from 11 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6816/876139031,v1:127.0.0.1:6817/876139031] [v2:127.0.0.1:6818/876139031,v1:127.0.0.1:6819/876139031] exists,up e2d86cb0-026e-4c74-98bf-1cc26aac9be5 2026-03-31T19:03:31.014 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=0 2026-03-31T19:03:31.014 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: break 2026-03-31T19:03:31.014 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: return 0 2026-03-31T19:03:31.014 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:95: TEST_classes: create_rbd_pool 2026-03-31T19:03:31.014 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:527: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-31T19:03:31.123 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd' does not exist 2026-03-31T19:03:31.131 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:528: create_rbd_pool: create_pool rbd 4 2026-03-31T19:03:31.131 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:533: create_pool: ceph osd pool create rbd 4 2026-03-31T19:03:31.294 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd' already exists 2026-03-31T19:03:31.302 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:534: create_pool: sleep 1 2026-03-31T19:03:32.303 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:529: create_rbd_pool: rbd pool init rbd 2026-03-31T19:03:32.594 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:97: TEST_classes: get_osds_up rbd SOMETHING 2026-03-31T19:03:32.594 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd 2026-03-31T19:03:32.594 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING 2026-03-31T19:03:32.594 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING 2026-03-31T19:03:32.594 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . 
-o ' ' 2026-03-31T19:03:32.707 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=1 2 0 ' 2026-03-31T19:03:32.708 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 1 2 0 2026-03-31T19:03:32.708 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:97: TEST_classes: test '1 2 0' == '1 2 0' 2026-03-31T19:03:32.708 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:98: TEST_classes: add_something td/crush-classes SOMETHING 2026-03-31T19:03:32.708 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:42: add_something: local dir=td/crush-classes 2026-03-31T19:03:32.708 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:43: add_something: local obj=SOMETHING 2026-03-31T19:03:32.708 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:45: add_something: local payload=ABCDEF 2026-03-31T19:03:32.708 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:46: add_something: echo ABCDEF 2026-03-31T19:03:32.708 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:47: add_something: rados --pool rbd put SOMETHING td/crush-classes/ORIGINAL 2026-03-31T19:03:32.732 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:104: TEST_classes: ceph osd getcrushmap 2026-03-31T19:03:32.835 INFO:tasks.workunit.client.0.vm05.stderr:4 2026-03-31T19:03:32.842 
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:105: TEST_classes: crushtool -d td/crush-classes/map -o td/crush-classes/map.txt 2026-03-31T19:03:32.854 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:106: TEST_classes: sed -i -e '/device 0 osd.0/s/$/ class ssd/' -e '/step take default/s/$/ class ssd/' td/crush-classes/map.txt 2026-03-31T19:03:32.856 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:110: TEST_classes: crushtool -c td/crush-classes/map.txt -o td/crush-classes/map-new 2026-03-31T19:03:32.867 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:111: TEST_classes: ceph osd setcrushmap -i td/crush-classes/map-new 2026-03-31T19:03:33.095 INFO:tasks.workunit.client.0.vm05.stderr:6 2026-03-31T19:03:33.106 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:117: TEST_classes: ok=false 2026-03-31T19:03:33.107 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:118: TEST_classes: for delay in 2 4 8 16 32 64 128 256 2026-03-31T19:03:33.107 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:119: TEST_classes: get_osds_up rbd SOMETHING_ELSE 2026-03-31T19:03:33.107 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd 2026-03-31T19:03:33.107 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING_ELSE 2026-03-31T19:03:33.107 
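Lines 104-111 of `crush-classes.sh` decompile the CRUSH map with `crushtool -d`, tag osd.0 and the rule's `step take default` with a device class by appending ` class ssd`, then recompile and inject the map. The sed edit itself can be exercised on any text file; here is that step in isolation, run on a hypothetical two-rule excerpt of a decompiled map:

```shell
# Reproduce the crush-classes.sh sed edit on a minimal decompiled-map excerpt:
# append " class ssd" to the device line for osd.0 and to the CRUSH rule's
# "step take default" line, leaving every other line untouched.
map=$(mktemp)
cat > "$map" <<'EOF'
device 0 osd.0
device 1 osd.1
step take default
EOF
sed -i -e '/device 0 osd.0/s/$/ class ssd/' \
       -e '/step take default/s/$/ class ssd/' "$map"
cat "$map"
rm -f "$map"
```

After the edit, only the osd.0 device line and the `step take default` line carry the `class ssd` suffix; in the real test the edited text is fed back through `crushtool -c` and `ceph osd setcrushmap -i`.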
INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING_ELSE 2026-03-31T19:03:33.107 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . -o ' ' 2026-03-31T19:03:33.218 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=0 ' 2026-03-31T19:03:33.218 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 0 2026-03-31T19:03:33.218 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:119: TEST_classes: test 0 == 0 2026-03-31T19:03:33.218 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:120: TEST_classes: ok=true 2026-03-31T19:03:33.218 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:121: TEST_classes: break 2026-03-31T19:03:33.218 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:127: TEST_classes: true 2026-03-31T19:03:33.219 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:132: TEST_classes: add_something td/crush-classes SOMETHING_ELSE 2026-03-31T19:03:33.219 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:42: add_something: local dir=td/crush-classes 2026-03-31T19:03:33.219 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:43: add_something: local obj=SOMETHING_ELSE 2026-03-31T19:03:33.219 
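The `for delay in 2 4 8 16 32 64 128 256` trace is the test's standard wait-for-condition idiom: check the mapping, and on failure sleep an exponentially growing delay before retrying, recording success in `$ok` so the caller can assert it afterwards. A generic, self-contained version of that idiom (the `"$@"` command is a stand-in for the `test "$(get_osds_up rbd SOMETHING_ELSE)" == ...` check in the real script):

```shell
# Exponential-backoff retry, as in crush-classes.sh TEST_classes:
# run the condition, and on failure sleep 2, 4, 8, ... seconds
# between attempts; $ok records whether the condition ever held,
# and the function's exit status is 0 only on success.
retry_with_backoff() {
    local ok=false
    local delay
    for delay in 2 4 8 16 32 64 128 256; do
        if "$@"; then
            ok=true
            break
        fi
        sleep $delay
    done
    $ok
}
```

In the log both backoff loops succeed on the first pass (`ok=true`, `break`), so no sleep is ever reached; the total budget if every attempt failed would be 510 seconds.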
INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:45: add_something: local payload=ABCDEF 2026-03-31T19:03:33.219 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:46: add_something: echo ABCDEF 2026-03-31T19:03:33.219 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:47: add_something: rados --pool rbd put SOMETHING_ELSE td/crush-classes/ORIGINAL 2026-03-31T19:03:33.241 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:138: TEST_classes: ceph osd crush dump 2026-03-31T19:03:33.241 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:138: TEST_classes: grep -q '~ssd' 2026-03-31T19:03:33.348 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:146: TEST_set_device_class: ceph osd crush set-device-class ssd osd.0 2026-03-31T19:03:33.609 INFO:tasks.workunit.client.0.vm05.stderr:osd.0 already set to class ssd. set-device-class item id 0 name 'osd.0' device_class 'ssd': no change. 
set osd(s) to class 'ssd' 2026-03-31T19:03:33.617 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:147: TEST_set_device_class: ceph osd crush class ls-osd ssd 2026-03-31T19:03:33.617 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:147: TEST_set_device_class: grep 0 2026-03-31T19:03:33.730 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-31T19:03:33.730 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:148: TEST_set_device_class: ceph osd crush set-device-class ssd osd.1 2026-03-31T19:03:33.917 INFO:tasks.workunit.client.0.vm05.stderr:osd.1 already set to class ssd. set-device-class item id 1 name 'osd.1' device_class 'ssd': no change. set osd(s) to class 'ssd' 2026-03-31T19:03:33.930 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:149: TEST_set_device_class: ceph osd crush class ls-osd ssd 2026-03-31T19:03:33.930 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:149: TEST_set_device_class: grep 1 2026-03-31T19:03:34.041 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-31T19:03:34.041 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:150: TEST_set_device_class: ceph osd crush set-device-class ssd 0 1 2026-03-31T19:03:34.231 INFO:tasks.workunit.client.0.vm05.stderr:osd.0 already set to class ssd. set-device-class item id 0 name 'osd.0' device_class 'ssd': no change. osd.1 already set to class ssd. set-device-class item id 1 name 'osd.1' device_class 'ssd': no change. 
set osd(s) to class 'ssd' 2026-03-31T19:03:34.239 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:152: TEST_set_device_class: ok=false 2026-03-31T19:03:34.239 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:153: TEST_set_device_class: for delay in 2 4 8 16 32 64 128 256 2026-03-31T19:03:34.239 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:154: TEST_set_device_class: get_osds_up rbd SOMETHING_ELSE 2026-03-31T19:03:34.239 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:51: get_osds_up: local poolname=rbd 2026-03-31T19:03:34.239 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:52: get_osds_up: local objectname=SOMETHING_ELSE 2026-03-31T19:03:34.240 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: ceph --format xml osd map rbd SOMETHING_ELSE 2026-03-31T19:03:34.240 INFO:tasks.workunit.client.0.vm05.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: xmlstarlet sel -t -m //up/osd -v . 
-o ' ' 2026-03-31T19:03:34.344 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:55: get_osds_up: local 'osds=0 1 ' 2026-03-31T19:03:34.344 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:57: get_osds_up: echo 0 1 2026-03-31T19:03:34.344 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:154: TEST_set_device_class: test '0 1' == '0 1' 2026-03-31T19:03:34.344 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:155: TEST_set_device_class: ok=true 2026-03-31T19:03:34.344 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:156: TEST_set_device_class: break 2026-03-31T19:03:34.344 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:163: TEST_set_device_class: true 2026-03-31T19:03:34.344 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/crush/crush-classes.sh:37: run: teardown td/crush-classes 2026-03-31T19:03:34.344 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-classes 2026-03-31T19:03:34.344 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs= 2026-03-31T19:03:34.344 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-classes KILL 2026-03-31T19:03:34.344 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:03:34.344 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 2026-03-31T19:03:34.345 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:03:34.345 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:03:34.345 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:03:34.452 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:03:34.452 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:03:34.453 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:03:34.453 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 
2026-03-31T19:03:34.454 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:03:34.454 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:03:34.454 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:03:34.455 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:34.455 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:03:34.455 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:03:34.456 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:34.456 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:03:34.457 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o '' = 1 ']' 2026-03-31T19:03:34.457 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-classes 2026-03-31T19:03:34.466 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:03:34.467 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:34.467 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:34.467 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.65997 2026-03-31T19:03:34.467 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:03:34.467 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:03:34.467 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2405: main: code=0 2026-03-31T19:03:34.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2409: main: teardown td/crush-classes 0 2026-03-31T19:03:34.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:155: teardown: local dir=td/crush-classes 2026-03-31T19:03:34.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:156: teardown: local dumplogs=0 2026-03-31T19:03:34.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:157: teardown: kill_daemons td/crush-classes KILL 2026-03-31T19:03:34.468 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: shopt -q -o xtrace 2026-03-31T19:03:34.468 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: echo true 
2026-03-31T19:03:34.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:336: kill_daemons: local trace=true 2026-03-31T19:03:34.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: true 2026-03-31T19:03:34.468 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:337: kill_daemons: shopt -u -o xtrace 2026-03-31T19:03:34.470 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:353: kill_daemons: return 0 2026-03-31T19:03:34.470 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: uname 2026-03-31T19:03:34.471 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:158: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-31T19:03:34.471 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: stat -f -c %T . 
2026-03-31T19:03:34.472 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:159: teardown: '[' xfs == btrfs ']' 2026-03-31T19:03:34.472 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:162: teardown: local cores=no 2026-03-31T19:03:34.472 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: sysctl -n kernel.core_pattern 2026-03-31T19:03:34.472 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:163: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:34.472 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: '[' / = '|' ']' 2026-03-31T19:03:34.473 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: grep -q '^core\|core$' 2026-03-31T19:03:34.473 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-31T19:03:34.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-31T19:03:34.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-31T19:03:34.474 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: rm -fr td/crush-classes 2026-03-31T19:03:34.475 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: get_asok_dir 2026-03-31T19:03:34.475 
INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:99: get_asok_dir: '[' -n '' ']' 2026-03-31T19:03:34.475 INFO:tasks.workunit.client.0.vm05.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:102: get_asok_dir: echo /tmp/ceph-asok.65997 2026-03-31T19:03:34.476 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: rm -rf /tmp/ceph-asok.65997 2026-03-31T19:03:34.476 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:191: teardown: '[' no = yes ']' 2026-03-31T19:03:34.476 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: return 0 2026-03-31T19:03:34.476 INFO:tasks.workunit.client.0.vm05.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2410: main: return 0 2026-03-31T19:03:34.477 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-31T19:03:34.477 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-31T19:03:34.538 INFO:tasks.workunit:Stopping ['crush'] on client.0... 
2026-03-31T19:03:34.538 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-31T19:03:34.944 DEBUG:teuthology.parallel:result is None 2026-03-31T19:03:34.944 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-31T19:03:34.966 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-31T19:03:34.966 DEBUG:teuthology.orchestra.run.vm05:> rmdir -- /home/ubuntu/cephtest/mnt.0 2026-03-31T19:03:35.019 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0 2026-03-31T19:03:35.019 DEBUG:teuthology.run_tasks:Unwinding manager install 2026-03-31T19:03:35.021 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer... 2026-03-31T19:03:35.021 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-31T19:03:35.085 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, ceph-volume, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 
2026-03-31T19:03:35.085 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-31T19:03:35.085 DEBUG:teuthology.orchestra.run.vm05:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-31T19:03:35.085 DEBUG:teuthology.orchestra.run.vm05:> sudo yum -y remove $d || true 2026-03-31T19:03:35.085 DEBUG:teuthology.orchestra.run.vm05:> done 2026-03-31T19:03:35.275 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-31T19:03:35.275 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-31T19:03:35.275 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-31T19:03:35.275 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-31T19:03:35.275 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-31T19:03:35.275 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 103 M 2026-03-31T19:03:35.275 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies: 2026-03-31T19:03:35.276 INFO:teuthology.orchestra.run.vm05.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-31T19:03:35.276 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:35.276 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-31T19:03:35.276 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-31T19:03:35.276 INFO:teuthology.orchestra.run.vm05.stdout:Remove 2 Packages 2026-03-31T19:03:35.276 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:35.276 
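During the install unwind, teuthology removes each Ceph package with an independent `sudo yum -y remove $d || true`, so a package that is already gone (or was never installed) cannot abort the rest of the cleanup. The pattern in isolation, with a hypothetical `remove_pkg`-style stub standing in for `sudo yum -y remove`:

```shell
# Remove a list of packages one at a time, tolerating failures:
# '|| true' keeps the loop (and any 'set -e' caller) going when a
# single removal fails, mirroring teuthology's install unwind.
remove_all() {
    local remover=$1; shift
    local d
    for d in "$@"; do
        "$remover" "$d" || true
    done
}
```

yum itself also resolves reverse dependencies per invocation, which is why the transcript below shows extra "Removing unused dependencies" entries (mailcap, socat, xmlstarlet, lua, ...) alongside each named package.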
INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 103 M 2026-03-31T19:03:35.276 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-31T19:03:35.278 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-31T19:03:35.278 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-31T19:03:35.294 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-31T19:03:35.294 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-31T19:03:35.328 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-31T19:03:35.345 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:20.2.0-721.g5bb32787.el9.x86_64 1/2 2026-03-31T19:03:35.345 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:03:35.345 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-31T19:03:35.345 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-31T19:03:35.345 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 
2026-03-31T19:03:35.345 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:35.352 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-radosgw-2:20.2.0-721.g5bb32787.el9.x86_64 1/2 2026-03-31T19:03:35.359 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:20.2.0-721.g5bb32787.el9.x86_64 1/2 2026-03-31T19:03:35.375 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-31T19:03:35.436 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-31T19:03:35.436 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-radosgw-2:20.2.0-721.g5bb32787.el9.x86_64 1/2 2026-03-31T19:03:35.478 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-31T19:03:35.478 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:35.478 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-31T19:03:35.478 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw-2:20.2.0-721.g5bb32787.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-31T19:03:35.478 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:35.478 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 362 M 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies: 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout:Remove 3 Packages 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:35.653 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 363 M 2026-03-31T19:03:35.654 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-31T19:03:35.656 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-31T19:03:35.656 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-31T19:03:35.677 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 
2026-03-31T19:03:35.677 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-31T19:03:35.745 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-31T19:03:35.751 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-test-2:20.2.0-721.g5bb32787.el9.x86_64 1/3 2026-03-31T19:03:35.753 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 2/3 2026-03-31T19:03:35.767 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 3/3 2026-03-31T19:03:35.826 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: xmlstarlet-1.6.1-20.el9.x86_64 3/3 2026-03-31T19:03:35.826 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-test-2:20.2.0-721.g5bb32787.el9.x86_64 1/3 2026-03-31T19:03:35.826 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 2/3 2026-03-31T19:03:35.867 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 3/3 2026-03-31T19:03:35.868 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:35.868 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-31T19:03:35.868 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test-2:20.2.0-721.g5bb32787.el9.x86_64 socat-1.7.4.1-8.el9.x86_64 2026-03-31T19:03:35.868 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-31T19:03:35.868 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:35.868 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-31T19:03:36.037 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-31T19:03:36.038 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-31T19:03:36.038 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout: ceph x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 0 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies: 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 6.8 M 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 19 M 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout: lua x86_64 5.4.4-4.el9 @appstream 593 k 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout: lua-devel x86_64 5.4.4-4.el9 @crb 49 k 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout: luarocks noarch 3.9.2-5.el9 @epel 692 k 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout: unzip x86_64 6.0-59.el9 @baseos 389 k 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout: zip x86_64 3.0-35.el9 @baseos 724 k 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout:Remove 8 Packages 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 28 M 
2026-03-31T19:03:36.039 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-31T19:03:36.041 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-31T19:03:36.041 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-31T19:03:36.063 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-31T19:03:36.063 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-31T19:03:36.101 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-31T19:03:36.106 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-2:20.2.0-721.g5bb32787.el9.x86_64 1/8 2026-03-31T19:03:36.109 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : luarocks-3.9.2-5.el9.noarch 2/8 2026-03-31T19:03:36.110 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : lua-devel-5.4.4-4.el9.x86_64 3/8 2026-03-31T19:03:36.113 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : zip-3.0-35.el9.x86_64 4/8 2026-03-31T19:03:36.115 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : unzip-6.0-59.el9.x86_64 5/8 2026-03-31T19:03:36.117 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : lua-5.4.4-4.el9.x86_64 6/8 2026-03-31T19:03:36.136 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:20.2.0-721.g5bb32787.el9.x86_64 7/8 2026-03-31T19:03:36.136 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:03:36.136 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-31T19:03:36.136 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target". 2026-03-31T19:03:36.136 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target". 
2026-03-31T19:03:36.136 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:36.137 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mds-2:20.2.0-721.g5bb32787.el9.x86_64 7/8 2026-03-31T19:03:36.142 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:20.2.0-721.g5bb32787.el9.x86_64 7/8 2026-03-31T19:03:36.160 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:20.2.0-721.g5bb32787.el9.x86_64 8/8 2026-03-31T19:03:36.160 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:03:36.160 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-31T19:03:36.160 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target". 2026-03-31T19:03:36.160 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target". 2026-03-31T19:03:36.160 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:36.162 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mon-2:20.2.0-721.g5bb32787.el9.x86_64 8/8 2026-03-31T19:03:36.236 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:20.2.0-721.g5bb32787.el9.x86_64 8/8 2026-03-31T19:03:36.236 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-2:20.2.0-721.g5bb32787.el9.x86_64 1/8 2026-03-31T19:03:36.236 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mds-2:20.2.0-721.g5bb32787.el9.x86_64 2/8 2026-03-31T19:03:36.236 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mon-2:20.2.0-721.g5bb32787.el9.x86_64 3/8 2026-03-31T19:03:36.236 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lua-5.4.4-4.el9.x86_64 4/8 2026-03-31T19:03:36.236 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lua-devel-5.4.4-4.el9.x86_64 5/8 2026-03-31T19:03:36.236 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
luarocks-3.9.2-5.el9.noarch 6/8 2026-03-31T19:03:36.236 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : unzip-6.0-59.el9.x86_64 7/8 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : zip-3.0-35.el9.x86_64 8/8 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout: ceph-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon-2:20.2.0-721.g5bb32787.el9.x86_64 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout: lua-5.4.4-4.el9.x86_64 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout: lua-devel-5.4.4-4.el9.x86_64 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout: luarocks-3.9.2-5.el9.noarch 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout: unzip-6.0-59.el9.x86_64 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout: zip-3.0-35.el9.x86_64 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:36.280 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-31T19:03:36.455 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout:=========================================================================================== 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout:=========================================================================================== 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 24 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages: 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 447 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 2.9 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm noarch 2:20.2.0-721.g5bb32787.el9 @ceph-noarch 940 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard noarch 2:20.2.0-721.g5bb32787.el9 @ceph-noarch 140 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local noarch 2:20.2.0-721.g5bb32787.el9 @ceph-noarch 66 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook noarch 2:20.2.0-721.g5bb32787.el9 @ceph-noarch 567 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 54 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-volume noarch 2:20.2.0-721.g5bb32787.el9 @ceph-noarch 1.4 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 11 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused 
dependencies: 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: abseil-cpp x86_64 20211102.0-4.el9 @epel 1.9 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 98 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards noarch 2:20.2.0-721.g5bb32787.el9 @ceph-noarch 996 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-k8sevents noarch 2:20.2.0-721.g5bb32787.el9 @ceph-noarch 60 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core noarch 2:20.2.0-721.g5bb32787.el9 @ceph-noarch 1.6 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts noarch 2:20.2.0-721.g5bb32787.el9 @ceph-noarch 59 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 138 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: cryptsetup x86_64 2.8.1-3.el9 @baseos 770 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: grpc-data noarch 1.46.7-10.el9 @epel 13 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 409 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k 
2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1 x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 792 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-31T19:03:36.461 INFO:teuthology.orchestra.run.vm05.stdout: nvme-cli x86_64 2.16-1.el9 @baseos 7.0 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: pciutils x86_64 3.7.0-7.el9 @baseos 216 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: protobuf x86_64 3.14.0-17.el9 @appstream 3.5 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: protobuf-compiler x86_64 3.14.0-17.el9 @crb 2.9 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k 2026-03-31T19:03:36.462 
INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 855 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot noarch 10.0.1-5.el9 @epel 682 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy noarch 18.10.0-5.el9 @epel 1.0 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-grpcio x86_64 1.46.7-10.el9 @epel 6.7 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-grpcio-tools x86_64 1.46.7-10.el9 @epel 418 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-influxdb noarch 5.3.1-1.el9 @epel 747 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-isodate noarch 0.6.1-3.el9 @epel 203 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k 2026-03-31T19:03:36.462 
INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-lxml x86_64 4.6.5-3.el9 @appstream 4.2 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-msgpack x86_64 1.0.3-2.el9 @epel 264 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py x86_64 
1:1.23.5-2.el9 @appstream 1.7 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-packaging noarch 20.9-5.el9 @appstream 248 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-protobuf noarch 3.14.0-17.el9 @appstream 1.4 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyparsing noarch 2.4.7-9.el9 @baseos 635 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k 2026-03-31T19:03:36.462 
INFO:teuthology.orchestra.run.vm05.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-saml noarch 1.16.0-1.el9 @epel 730 k 2026-03-31T19:03:36.462 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmlsec x86_64 1.3.13-1.el9 @epel 158 k 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: qatlib x86_64 25.08.0-2.el9 @appstream 639 k 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: qatlib-service x86_64 25.08.0-2.el9 @appstream 69 k 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: qatzip-libs x86_64 1.3.1-1.el9 @appstream 148 k 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: smartmontools x86_64 1:7.2-10.el9 @baseos 1.9 M 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: xmlsec1 x86_64 1.2.29-13.el9 @appstream 596 k 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: xmlsec1-openssl x86_64 1.2.29-13.el9 @appstream 281 k 2026-03-31T19:03:36.463 
INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout:=========================================================================================== 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout:Remove 110 Packages 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 683 M 2026-03-31T19:03:36.463 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-31T19:03:36.489 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-31T19:03:36.489 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-31T19:03:36.589 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-31T19:03:36.590 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-31T19:03:36.728 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-31T19:03:36.728 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-rook-2:20.2.0-721.g5bb32787.el9.noarch 1/110 2026-03-31T19:03:36.735 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-rook-2:20.2.0-721.g5bb32787.el9.noarch 1/110 2026-03-31T19:03:36.752 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:20.2.0-721.g5bb32787.el9.x86_64 2/110 2026-03-31T19:03:36.752 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:03:36.752 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-31T19:03:36.752 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target". 
2026-03-31T19:03:36.752 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target". 2026-03-31T19:03:36.752 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:36.752 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-2:20.2.0-721.g5bb32787.el9.x86_64 2/110 2026-03-31T19:03:36.763 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:20.2.0-721.g5bb32787.el9.x86_64 2/110 2026-03-31T19:03:36.817 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-modules-core-2:20.2.0-721.g5bb32787.el9 3/110 2026-03-31T19:03:36.817 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-dashboard-2:20.2.0-721.g5bb32787.el9.no 4/110 2026-03-31T19:03:36.830 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-dashboard-2:20.2.0-721.g5bb32787.el9.no 4/110 2026-03-31T19:03:36.834 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-influxdb-5.3.1-1.el9.noarch 5/110 2026-03-31T19:03:36.834 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-cephadm-2:20.2.0-721.g5bb32787.el9.noar 6/110 2026-03-31T19:03:36.844 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-cephadm-2:20.2.0-721.g5bb32787.el9.noar 6/110 2026-03-31T19:03:36.849 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cherrypy-18.10.0-5.el9.noarch 7/110 2026-03-31T19:03:36.853 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cheroot-10.0.1-5.el9.noarch 8/110 2026-03-31T19:03:36.861 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-grpcio-tools-1.46.7-10.el9.x86_64 9/110 2026-03-31T19:03:36.864 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-grpcio-1.46.7-10.el9.x86_64 10/110 2026-03-31T19:03:36.884 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:20.2.0-721.g5bb32787.el9.x86_64 11/110 2026-03-31T19:03:36.884 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not 
supported for this. 2026-03-31T19:03:36.884 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-31T19:03:36.884 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target". 2026-03-31T19:03:36.884 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target". 2026-03-31T19:03:36.884 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:36.889 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-osd-2:20.2.0-721.g5bb32787.el9.x86_64 11/110 2026-03-31T19:03:36.895 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:20.2.0-721.g5bb32787.el9.x86_64 11/110 2026-03-31T19:03:36.909 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-volume-2:20.2.0-721.g5bb32787.el9.noarch 12/110 2026-03-31T19:03:36.909 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:03:36.909 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-volume@*.service" escaped as "ceph-volume@\x2a.service". 
2026-03-31T19:03:36.909 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:36.917 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-volume-2:20.2.0-721.g5bb32787.el9.noarch 12/110 2026-03-31T19:03:36.924 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-volume-2:20.2.0-721.g5bb32787.el9.noarch 12/110 2026-03-31T19:03:36.926 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 13/110 2026-03-31T19:03:36.930 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 14/110 2026-03-31T19:03:36.934 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 15/110 2026-03-31T19:03:36.961 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-saml-1.16.0-1.el9.noarch 16/110 2026-03-31T19:03:36.966 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 17/110 2026-03-31T19:03:36.969 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 18/110 2026-03-31T19:03:36.977 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 19/110 2026-03-31T19:03:36.986 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 20/110 2026-03-31T19:03:36.986 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-diskprediction-local-2:20.2.0-721.g5bb3 21/110 2026-03-31T19:03:36.992 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:20.2.0-721.g5bb3 21/110 2026-03-31T19:03:37.078 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 22/110 2026-03-31T19:03:37.091 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 23/110 2026-03-31T19:03:37.096 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-xmlsec-1.3.13-1.el9.x86_64 24/110 2026-03-31T19:03:37.099 
INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-lxml-4.6.5-3.el9.x86_64 25/110 2026-03-31T19:03:37.111 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 26/110 2026-03-31T19:03:37.111 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service". 2026-03-31T19:03:37.111 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:37.112 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 26/110 2026-03-31T19:03:37.135 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 26/110 2026-03-31T19:03:37.139 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 27/110 2026-03-31T19:03:37.140 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : xmlsec1-openssl-1.2.29-13.el9.x86_64 28/110 2026-03-31T19:03:37.152 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : xmlsec1-1.2.29-13.el9.x86_64 29/110 2026-03-31T19:03:37.157 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 30/110 2026-03-31T19:03:37.159 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : protobuf-compiler-3.14.0-17.el9.x86_64 31/110 2026-03-31T19:03:37.161 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 32/110 2026-03-31T19:03:37.180 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:20.2.0-721.g5bb32787.el9.x86_64 33/110 2026-03-31T19:03:37.180 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-31T19:03:37.180 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-31T19:03:37.180 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target". 
2026-03-31T19:03:37.180 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target". 2026-03-31T19:03:37.180 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-31T19:03:37.181 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-mirror-2:20.2.0-721.g5bb32787.el9.x86_64 33/110 2026-03-31T19:03:37.187 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:20.2.0-721.g5bb32787.el9.x86_64 33/110 2026-03-31T19:03:37.190 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 34/110 2026-03-31T19:03:37.193 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-packaging-20.9-5.el9.noarch 35/110 2026-03-31T19:03:37.195 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 36/110 2026-03-31T19:03:37.197 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 37/110 2026-03-31T19:03:37.200 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 38/110 2026-03-31T19:03:37.202 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 39/110 2026-03-31T19:03:37.202 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-k8sevents-2:20.2.0-721.g5bb32787.el9.no 40/110 2026-03-31T19:03:37.250 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-k8sevents-2:20.2.0-721.g5bb32787.el9.no 40/110 2026-03-31T19:03:37.257 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 41/110 2026-03-31T19:03:37.260 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 42/110 2026-03-31T19:03:37.268 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 43/110 2026-03-31T19:03:37.272 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 44/110 
2026-03-31T19:03:37.281 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 45/110
2026-03-31T19:03:37.287 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 46/110
2026-03-31T19:03:37.290 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 47/110
2026-03-31T19:03:37.295 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 48/110
2026-03-31T19:03:37.339 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 49/110
2026-03-31T19:03:37.348 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 50/110
2026-03-31T19:03:37.350 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 51/110
2026-03-31T19:03:37.355 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 52/110
2026-03-31T19:03:37.356 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 53/110
2026-03-31T19:03:37.359 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 54/110
2026-03-31T19:03:37.362 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 55/110
2026-03-31T19:03:37.381 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:20.2.0-721.g5bb327 56/110
2026-03-31T19:03:37.381 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-31T19:03:37.381 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-31T19:03:37.381 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:37.381 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-immutable-object-cache-2:20.2.0-721.g5bb327 56/110
2026-03-31T19:03:37.388 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:20.2.0-721.g5bb327 56/110
2026-03-31T19:03:37.389 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 57/110
2026-03-31T19:03:37.391 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 58/110
2026-03-31T19:03:37.394 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ply-3.11-14.el9.noarch 59/110
2026-03-31T19:03:37.396 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 60/110
2026-03-31T19:03:37.401 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 61/110
2026-03-31T19:03:37.405 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 62/110
2026-03-31T19:03:37.410 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/110
2026-03-31T19:03:37.417 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/110
2026-03-31T19:03:37.422 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 65/110
2026-03-31T19:03:37.425 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 66/110
2026-03-31T19:03:37.427 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 67/110
2026-03-31T19:03:37.429 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 68/110
2026-03-31T19:03:37.430 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 69/110
2026-03-31T19:03:37.433 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 70/110
2026-03-31T19:03:37.435 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 71/110
2026-03-31T19:03:37.437 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 72/110
2026-03-31T19:03:37.440 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyparsing-2.4.7-9.el9.noarch 73/110
2026-03-31T19:03:37.447 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 74/110
2026-03-31T19:03:37.450 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 75/110
2026-03-31T19:03:37.452 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 76/110
2026-03-31T19:03:37.455 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 77/110
2026-03-31T19:03:37.457 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-isodate-0.6.1-3.el9.noarch 78/110
2026-03-31T19:03:37.459 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 79/110
2026-03-31T19:03:37.464 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : grpc-data-1.46.7-10.el9.noarch 80/110
2026-03-31T19:03:37.467 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-protobuf-3.14.0-17.el9.noarch 81/110
2026-03-31T19:03:37.470 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 82/110
2026-03-31T19:03:37.472 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 83/110
2026-03-31T19:03:37.474 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-grafana-dashboards-2:20.2.0-721.g5bb32787.e 84/110
2026-03-31T19:03:37.475 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-prometheus-alerts-2:20.2.0-721.g5bb32787.el 85/110
2026-03-31T19:03:37.492 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-base-2:20.2.0-721.g5bb32787.el9.x86_64 86/110
2026-03-31T19:03:37.492 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph.target".
2026-03-31T19:03:37.492 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service".
2026-03-31T19:03:37.492 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:37.499 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-base-2:20.2.0-721.g5bb32787.el9.x86_64 86/110
2026-03-31T19:03:37.523 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-base-2:20.2.0-721.g5bb32787.el9.x86_64 86/110
2026-03-31T19:03:37.523 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-common-2:20.2.0-721.g5bb32787.el9.x86_64 87/110
2026-03-31T19:03:37.533 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-common-2:20.2.0-721.g5bb32787.el9.x86_64 87/110
2026-03-31T19:03:37.538 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : qatzip-libs-1.3.1-1.el9.x86_64 88/110
2026-03-31T19:03:37.540 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ceph-common-2:20.2.0-721.g5bb32787.el9.x 89/110
2026-03-31T19:03:37.542 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 90/110
2026-03-31T19:03:37.542 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-selinux-2:20.2.0-721.g5bb32787.el9.x86_64 91/110
2026-03-31T19:03:42.145 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-selinux-2:20.2.0-721.g5bb32787.el9.x86_64 91/110
2026-03-31T19:03:42.145 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /sys
2026-03-31T19:03:42.145 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /proc
2026-03-31T19:03:42.145 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /mnt
2026-03-31T19:03:42.145 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /var/tmp
2026-03-31T19:03:42.145 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /home
2026-03-31T19:03:42.145 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /root
2026-03-31T19:03:42.145 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /tmp
2026-03-31T19:03:42.145 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:42.153 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : qatlib-25.08.0-2.el9.x86_64 92/110
2026-03-31T19:03:42.167 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 93/110
2026-03-31T19:03:42.167 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : qatlib-service-25.08.0-2.el9.x86_64 93/110
2026-03-31T19:03:42.173 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 93/110
2026-03-31T19:03:42.175 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 94/110
2026-03-31T19:03:42.178 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 95/110
2026-03-31T19:03:42.180 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : pciutils-3.7.0-7.el9.x86_64 96/110
2026-03-31T19:03:42.181 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 97/110
2026-03-31T19:03:42.181 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libradosstriper1-2:20.2.0-721.g5bb32787.el9.x86_ 98/110
2026-03-31T19:03:42.207 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libradosstriper1-2:20.2.0-721.g5bb32787.el9.x86_ 98/110
2026-03-31T19:03:42.211 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : nvme-cli-2.16-1.el9.x86_64 99/110
2026-03-31T19:03:42.221 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: smartmontools-1:7.2-10.el9.x86_64 100/110
2026-03-31T19:03:42.221 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/smartd.service".
2026-03-31T19:03:42.221 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:42.223 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : smartmontools-1:7.2-10.el9.x86_64 100/110
2026-03-31T19:03:42.229 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: smartmontools-1:7.2-10.el9.x86_64 100/110
2026-03-31T19:03:42.231 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 101/110
2026-03-31T19:03:42.233 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 102/110
2026-03-31T19:03:42.235 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : protobuf-3.14.0-17.el9.x86_64 103/110
2026-03-31T19:03:42.237 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 104/110
2026-03-31T19:03:42.240 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 105/110
2026-03-31T19:03:42.245 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 106/110
2026-03-31T19:03:42.251 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : cryptsetup-2.8.1-3.el9.x86_64 107/110
2026-03-31T19:03:42.255 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : abseil-cpp-20211102.0-4.el9.x86_64 108/110
2026-03-31T19:03:42.257 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-msgpack-1.0.3-2.el9.x86_64 109/110
2026-03-31T19:03:42.257 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephsqlite-2:20.2.0-721.g5bb32787.el9.x86_64 110/110
2026-03-31T19:03:42.341 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephsqlite-2:20.2.0-721.g5bb32787.el9.x86_64 110/110
2026-03-31T19:03:42.341 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : abseil-cpp-20211102.0-4.el9.x86_64 1/110
2026-03-31T19:03:42.341 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-base-2:20.2.0-721.g5bb32787.el9.x86_64 2/110
2026-03-31T19:03:42.341 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-common-2:20.2.0-721.g5bb32787.el9.x86_64 3/110
2026-03-31T19:03:42.341 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-grafana-dashboards-2:20.2.0-721.g5bb32787.e 4/110
2026-03-31T19:03:42.341 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-immutable-object-cache-2:20.2.0-721.g5bb327 5/110
2026-03-31T19:03:42.341 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-2:20.2.0-721.g5bb32787.el9.x86_64 6/110
2026-03-31T19:03:42.341 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-cephadm-2:20.2.0-721.g5bb32787.el9.noar 7/110
2026-03-31T19:03:42.341 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-dashboard-2:20.2.0-721.g5bb32787.el9.no 8/110
2026-03-31T19:03:42.341 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-diskprediction-local-2:20.2.0-721.g5bb3 9/110
2026-03-31T19:03:42.341 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-k8sevents-2:20.2.0-721.g5bb32787.el9.no 10/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-modules-core-2:20.2.0-721.g5bb32787.el9 11/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-rook-2:20.2.0-721.g5bb32787.el9.noarch 12/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-osd-2:20.2.0-721.g5bb32787.el9.x86_64 13/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-prometheus-alerts-2:20.2.0-721.g5bb32787.el 14/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-selinux-2:20.2.0-721.g5bb32787.el9.x86_64 15/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-volume-2:20.2.0-721.g5bb32787.el9.noarch 16/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : cryptsetup-2.8.1-3.el9.x86_64 17/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 18/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 19/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 20/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 21/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : grpc-data-1.46.7-10.el9.noarch 22/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 23/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephsqlite-2:20.2.0-721.g5bb32787.el9.x86_64 24/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 25/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 26/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 27/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 28/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libradosstriper1-2:20.2.0-721.g5bb32787.el9.x86_ 29/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 30/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 31/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 32/110
2026-03-31T19:03:42.342 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : nvme-cli-2.16-1.el9.x86_64 33/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 34/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 35/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : pciutils-3.7.0-7.el9.x86_64 36/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : protobuf-3.14.0-17.el9.x86_64 37/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : protobuf-compiler-3.14.0-17.el9.x86_64 38/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 39/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 40/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 41/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 42/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 43/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 44/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-common-2:20.2.0-721.g5bb32787.el9.x 45/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 46/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 47/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 48/110
2026-03-31T19:03:42.343 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cheroot-10.0.1-5.el9.noarch 49/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cherrypy-18.10.0-5.el9.noarch 50/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 51/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 52/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 53/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-grpcio-1.46.7-10.el9.x86_64 54/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-grpcio-tools-1.46.7-10.el9.x86_64 55/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 56/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-influxdb-5.3.1-1.el9.noarch 57/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-isodate-0.6.1-3.el9.noarch 58/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 59/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 60/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 61/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 62/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 63/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 64/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 65/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 66/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 67/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 68/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 69/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-lxml-4.6.5-3.el9.x86_64 70/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 71/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 72/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-msgpack-1.0.3-2.el9.x86_64 73/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 74/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 75/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 76/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 77/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-packaging-20.9-5.el9.noarch 78/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ply-3.11-14.el9.noarch 79/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 80/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 81/110
2026-03-31T19:03:42.344 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-protobuf-3.14.0-17.el9.noarch 82/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 83/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 84/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 85/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 86/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyparsing-2.4.7-9.el9.noarch 87/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 88/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 89/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 90/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 91/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 92/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 93/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 94/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-saml-1.16.0-1.el9.noarch 95/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 96/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 97/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 98/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 99/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 100/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 101/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-xmlsec-1.3.13-1.el9.x86_64 102/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 103/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : qatlib-25.08.0-2.el9.x86_64 104/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : qatlib-service-25.08.0-2.el9.x86_64 105/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : qatzip-libs-1.3.1-1.el9.x86_64 106/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-mirror-2:20.2.0-721.g5bb32787.el9.x86_64 107/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : smartmontools-1:7.2-10.el9.x86_64 108/110
2026-03-31T19:03:42.345 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlsec1-1.2.29-13.el9.x86_64 109/110
2026-03-31T19:03:42.409 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlsec1-openssl-1.2.29-13.el9.x86_64 110/110
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: abseil-cpp-20211102.0-4.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards-2:20.2.0-721.g5bb32787.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm-2:20.2.0-721.g5bb32787.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard-2:20.2.0-721.g5bb32787.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local-2:20.2.0-721.g5bb32787.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-k8sevents-2:20.2.0-721.g5bb32787.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core-2:20.2.0-721.g5bb32787.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook-2:20.2.0-721.g5bb32787.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts-2:20.2.0-721.g5bb32787.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ceph-volume-2:20.2.0-721.g5bb32787.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: cryptsetup-2.8.1-3.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: grpc-data-1.46.7-10.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: nvme-cli-2.16-1.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: pciutils-3.7.0-7.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: protobuf-3.14.0-17.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: protobuf-compiler-3.14.0-17.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-chardet-4.0.0-5.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot-10.0.1-5.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy-18.10.0-5.el9.noarch
2026-03-31T19:03:42.410 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-grpcio-1.46.7-10.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-grpcio-tools-1.46.7-10.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-idna-2.10-7.el9.1.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-influxdb-5.3.1-1.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-isodate-0.6.1-3.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpatch-1.21-16.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpointer-2.0-4.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-lxml-4.6.5-3.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-msgpack-1.0.3-2.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-oauthlib-3.1.1-5.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-packaging-20.9-5.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply-3.11-14.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-prettytable-0.7.2-27.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-protobuf-3.14.0-17.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyparsing-2.4.7-9.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-pysocks-1.7.1-12.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-pytz-2021.1-5.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-saml-1.16.0-1.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmlsec-1.3.13-1.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: qatlib-25.08.0-2.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: qatlib-service-25.08.0-2.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: qatzip-libs-1.3.1-1.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: smartmontools-1:7.2-10.el9.x86_64
2026-03-31T19:03:42.411 INFO:teuthology.orchestra.run.vm05.stdout: xmlsec1-1.2.29-13.el9.x86_64
2026-03-31T19:03:42.412 INFO:teuthology.orchestra.run.vm05.stdout: xmlsec1-openssl-1.2.29-13.el9.x86_64
2026-03-31T19:03:42.412 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:42.412 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:42.592 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout: cephadm noarch 2:20.2.0-721.g5bb32787.el9 @ceph-noarch 1.0 M
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout:Remove 1 Package
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 1.0 M
2026-03-31T19:03:42.593 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-31T19:03:42.594 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-31T19:03:42.594 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-31T19:03:42.596 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-31T19:03:42.596 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-31T19:03:42.610 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-31T19:03:42.610 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : cephadm-2:20.2.0-721.g5bb32787.el9.noarch 1/1
2026-03-31T19:03:42.720 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: cephadm-2:20.2.0-721.g5bb32787.el9.noarch 1/1
2026-03-31T19:03:42.754 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : cephadm-2:20.2.0-721.g5bb32787.el9.noarch 1/1
2026-03-31T19:03:42.754 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:42.754 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-31T19:03:42.754 INFO:teuthology.orchestra.run.vm05.stdout: cephadm-2:20.2.0-721.g5bb32787.el9.noarch
2026-03-31T19:03:42.754 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:42.754 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:42.938 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-immutable-object-cache
2026-03-31T19:03:42.939 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:42.941 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:42.942 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:42.942 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:43.083 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr
2026-03-31T19:03:43.083 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:43.086 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:43.087 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:43.087 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:43.231 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-dashboard
2026-03-31T19:03:43.231 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:43.234 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:43.235 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:43.235 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:43.376 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-31T19:03:43.376 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:43.379 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:43.380 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:43.380 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:43.521 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-rook
2026-03-31T19:03:43.521 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:43.524 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:43.525 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:43.525 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:43.668 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-cephadm
2026-03-31T19:03:43.668 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:43.671 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:43.671 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:43.671 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:43.829 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 2.7 M
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout: fuse x86_64 2.9.9-17.el9 @baseos 214 k
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout:Remove 2 Packages
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 2.9 M
2026-03-31T19:03:43.830 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-31T19:03:43.832 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-31T19:03:43.832 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-31T19:03:43.845 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-31T19:03:43.845 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-31T19:03:43.873 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-31T19:03:43.877 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-fuse-2:20.2.0-721.g5bb32787.el9.x86_64 1/2
2026-03-31T19:03:43.891 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : fuse-2.9.9-17.el9.x86_64 2/2
2026-03-31T19:03:43.950 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: fuse-2.9.9-17.el9.x86_64 2/2
2026-03-31T19:03:43.950 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-fuse-2:20.2.0-721.g5bb32787.el9.x86_64 1/2
2026-03-31T19:03:43.987 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : fuse-2.9.9-17.el9.x86_64 2/2
2026-03-31T19:03:43.987 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:43.987 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-31T19:03:43.987 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse-2:20.2.0-721.g5bb32787.el9.x86_64 fuse-2.9.9-17.el9.x86_64
2026-03-31T19:03:43.987 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:43.987 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:44.151 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-volume
2026-03-31T19:03:44.151 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:44.154 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:44.155 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:44.155 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:44.314 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repo Size
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 449 k
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages:
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 155 k
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout:Remove 2 Packages
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 604 k
2026-03-31T19:03:44.315 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-31T19:03:44.316 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-31T19:03:44.316 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-31T19:03:44.325 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-31T19:03:44.325 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-31T19:03:44.348 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-31T19:03:44.350 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephfs-devel-2:20.2.0-721.g5bb32787.el9.x86_64 1/2
2026-03-31T19:03:44.362 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librados-devel-2:20.2.0-721.g5bb32787.el9.x86_64 2/2
2026-03-31T19:03:44.419 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados-devel-2:20.2.0-721.g5bb32787.el9.x86_64 2/2
2026-03-31T19:03:44.419 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-devel-2:20.2.0-721.g5bb32787.el9.x86_64 1/2
2026-03-31T19:03:44.453 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados-devel-2:20.2.0-721.g5bb32787.el9.x86_64 2/2
2026-03-31T19:03:44.453 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:44.453 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-31T19:03:44.453 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:44.453 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:44.453 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:44.453 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:44.616 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repo Size
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2 x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 2.4 M
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages:
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 510 k
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-daemon x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 90 k
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-proxy2 x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 52 k
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 187 k
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:Remove 5 Packages
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 3.3 M
2026-03-31T19:03:44.617 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-31T19:03:44.619 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-31T19:03:44.619 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-31T19:03:44.630 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-31T19:03:44.630 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-31T19:03:44.655 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-31T19:03:44.657 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cephfs-2:20.2.0-721.g5bb32787.el9.x86_64 1/5
2026-03-31T19:03:44.658 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ceph-argparse-2:20.2.0-721.g5bb32787.el9.x86 2/5
2026-03-31T19:03:44.658 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephfs-proxy2-2:20.2.0-721.g5bb32787.el9.x86_64 3/5
2026-03-31T19:03:44.669 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephfs-proxy2-2:20.2.0-721.g5bb32787.el9.x86_64 3/5
2026-03-31T19:03:44.671 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephfs-daemon-2:20.2.0-721.g5bb32787.el9.x86_64 4/5
2026-03-31T19:03:44.671 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephfs2-2:20.2.0-721.g5bb32787.el9.x86_64 5/5
2026-03-31T19:03:44.726 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephfs2-2:20.2.0-721.g5bb32787.el9.x86_64 5/5
2026-03-31T19:03:44.726 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-daemon-2:20.2.0-721.g5bb32787.el9.x86_64 1/5
2026-03-31T19:03:44.726 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-proxy2-2:20.2.0-721.g5bb32787.el9.x86_64 2/5
2026-03-31T19:03:44.726 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs2-2:20.2.0-721.g5bb32787.el9.x86_64 3/5
2026-03-31T19:03:44.726 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-argparse-2:20.2.0-721.g5bb32787.el9.x86 4/5
2026-03-31T19:03:44.764 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cephfs-2:20.2.0-721.g5bb32787.el9.x86_64 5/5
2026-03-31T19:03:44.765 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:44.765 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-31T19:03:44.765 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-daemon-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:44.765 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-proxy2-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:44.765 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:44.765 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:44.765 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:44.765 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:44.765 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:44.916 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: libcephfs-devel
2026-03-31T19:03:44.916 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:44.918 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:44.918 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:44.918 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:45.075 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: librados2 x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 12 M
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages:
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 1.1 M
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 1.1 M
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 264 k
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-16.el9 @appstream 37 k
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 238 k
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 498 k
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: libnbd x86_64 1.20.3-4.el9 @appstream 453 k
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: librbd1 x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 10 M
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: librgw2 x86_64 2:20.2.0-721.g5bb32787.el9 @ceph 28 M
2026-03-31T19:03:45.076 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-31T19:03:45.077 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-31T19:03:45.077 INFO:teuthology.orchestra.run.vm05.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-31T19:03:45.077 INFO:teuthology.orchestra.run.vm05.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-31T19:03:45.077 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:45.077 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-31T19:03:45.077 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-31T19:03:45.077 INFO:teuthology.orchestra.run.vm05.stdout:Remove 20 Packages
2026-03-31T19:03:45.077 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:45.077 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 84 M
2026-03-31T19:03:45.077 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-31T19:03:45.080 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-31T19:03:45.080 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-31T19:03:45.101 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-31T19:03:45.101 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-31T19:03:45.139 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-31T19:03:45.142 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-nbd-2:20.2.0-721.g5bb32787.el9.x86_64 1/20
2026-03-31T19:03:45.144 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-fuse-2:20.2.0-721.g5bb32787.el9.x86_64 2/20
2026-03-31T19:03:45.147 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rgw-2:20.2.0-721.g5bb32787.el9.x86_64 3/20
2026-03-31T19:03:45.147 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librgw2-2:20.2.0-721.g5bb32787.el9.x86_64 4/20
2026-03-31T19:03:45.158 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librgw2-2:20.2.0-721.g5bb32787.el9.x86_64 4/20
2026-03-31T19:03:45.160 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/20
2026-03-31T19:03:45.162 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rbd-2:20.2.0-721.g5bb32787.el9.x86_64 6/20
2026-03-31T19:03:45.163 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rados-2:20.2.0-721.g5bb32787.el9.x86_64 7/20
2026-03-31T19:03:45.164 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-16.el9.x86_64 8/20
2026-03-31T19:03:45.167 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 9/20
2026-03-31T19:03:45.167 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librbd1-2:20.2.0-721.g5bb32787.el9.x86_64 10/20
2026-03-31T19:03:45.177 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librbd1-2:20.2.0-721.g5bb32787.el9.x86_64 10/20
2026-03-31T19:03:45.177 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librados2-2:20.2.0-721.g5bb32787.el9.x86_64 11/20
2026-03-31T19:03:45.178 INFO:teuthology.orchestra.run.vm05.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-31T19:03:45.178 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:45.189 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados2-2:20.2.0-721.g5bb32787.el9.x86_64 11/20
2026-03-31T19:03:45.190 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 12/20
2026-03-31T19:03:45.193 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : re2-1:20211101-20.el9.x86_64 13/20
2026-03-31T19:03:45.197 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 14/20
2026-03-31T19:03:45.199 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 15/20
2026-03-31T19:03:45.201 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libnbd-1.20.3-4.el9.x86_64 16/20
2026-03-31T19:03:45.202 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 17/20
2026-03-31T19:03:45.204 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 18/20
2026-03-31T19:03:45.206 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 19/20
2026-03-31T19:03:45.218 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 20/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 20/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 2/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 3/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libnbd-1.20.3-4.el9.x86_64 4/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados2-2:20.2.0-721.g5bb32787.el9.x86_64 7/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:20.2.0-721.g5bb32787.el9.x86_64 8/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librgw2-2:20.2.0-721.g5bb32787.el9.x86_64 10/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 11/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 12/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rados-2:20.2.0-721.g5bb32787.el9.x86_64 13/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rbd-2:20.2.0-721.g5bb32787.el9.x86_64 14/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rgw-2:20.2.0-721.g5bb32787.el9.x86_64 15/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-16.el9.x86_64 16/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-fuse-2:20.2.0-721.g5bb32787.el9.x86_64 17/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-nbd-2:20.2.0-721.g5bb32787.el9.x86_64 18/20
2026-03-31T19:03:45.273 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : re2-1:20211101-20.el9.x86_64 19/20
2026-03-31T19:03:45.314 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 20/20
2026-03-31T19:03:45.314 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: libnbd-1.20.3-4.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: librados2-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: librbd1-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: librgw2-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: qemu-kvm-block-rbd-17:10.1.0-16.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd-2:20.2.0-721.g5bb32787.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: re2-1:20211101-20.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-31T19:03:45.315 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:45.477 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: librbd1
2026-03-31T19:03:45.477 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:45.479 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:45.479 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:45.479 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:45.626 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rados
2026-03-31T19:03:45.626 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:45.628 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:45.628 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:45.628 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:45.774 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rgw
2026-03-31T19:03:45.774 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:45.776 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:45.776 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:45.776 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:45.918 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-cephfs
2026-03-31T19:03:45.918 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-31T19:03:45.920 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-31T19:03:45.921 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-31T19:03:45.921 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-31T19:03:46.064 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rbd 2026-03-31T19:03:46.064 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-31T19:03:46.066 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-31T19:03:46.066 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-31T19:03:46.066 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-31T19:03:46.209 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-fuse 2026-03-31T19:03:46.209 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-31T19:03:46.211 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-31T19:03:46.212 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-31T19:03:46.212 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-31T19:03:46.359 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-mirror 2026-03-31T19:03:46.359 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-31T19:03:46.361 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-31T19:03:46.361 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-31T19:03:46.361 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-31T19:03:46.512 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-nbd 2026-03-31T19:03:46.512 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-31T19:03:46.514 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-31T19:03:46.514 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-31T19:03:46.514 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 
2026-03-31T19:03:46.535 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean all
2026-03-31T19:03:46.655 INFO:teuthology.orchestra.run.vm05.stdout:56 files removed
2026-03-31T19:03:46.674 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-31T19:03:46.695 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean expire-cache
2026-03-31T19:03:46.835 INFO:teuthology.orchestra.run.vm05.stdout:Cache was expired
2026-03-31T19:03:46.835 INFO:teuthology.orchestra.run.vm05.stdout:0 files removed
2026-03-31T19:03:46.852 DEBUG:teuthology.parallel:result is None
2026-03-31T19:03:46.852 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm05.local
2026-03-31T19:03:46.852 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-31T19:03:46.874 DEBUG:teuthology.orchestra.run.vm05:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-31T19:03:46.936 DEBUG:teuthology.parallel:result is None
2026-03-31T19:03:46.936 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-31T19:03:46.938 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-31T19:03:46.938 DEBUG:teuthology.orchestra.run.vm05:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-31T19:03:46.989 INFO:teuthology.orchestra.run.vm05.stderr:bash: line 1: ntpq: command not found
2026-03-31T19:03:46.993 INFO:teuthology.orchestra.run.vm05.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-31T19:03:46.993 INFO:teuthology.orchestra.run.vm05.stdout:===============================================================================
2026-03-31T19:03:46.993 INFO:teuthology.orchestra.run.vm05.stdout:^+ nbg01.muxx.net 2 6 77 63 +222us[ +222us] +/- 16ms
2026-03-31T19:03:46.993 INFO:teuthology.orchestra.run.vm05.stdout:^+ ntp2.uni-ulm.de 2 6 77 63 +786us[ +786us] +/- 15ms
2026-03-31T19:03:46.993 INFO:teuthology.orchestra.run.vm05.stdout:^* ovh.saclay.org 2 6 77 63 +1129us[ +609us] +/- 24ms
2026-03-31T19:03:46.993 INFO:teuthology.orchestra.run.vm05.stdout:^+ ntp.kernfusion.at 2 6 77 62 -3806us[-3806us] +/- 30ms
2026-03-31T19:03:46.994 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-31T19:03:46.995 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-31T19:03:46.996 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-31T19:03:46.998 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-31T19:03:46.999 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-31T19:03:47.001 INFO:teuthology.task.internal:Duration was 289.810565 seconds
2026-03-31T19:03:47.001 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-31T19:03:47.003 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-31T19:03:47.003 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-31T19:03:47.067 INFO:teuthology.orchestra.run.vm05.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-31T19:03:47.440 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-31T19:03:47.440 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm05.local
2026-03-31T19:03:47.440 DEBUG:teuthology.orchestra.run.vm05:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-31T19:03:47.500 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-31T19:03:47.500 DEBUG:teuthology.orchestra.run.vm05:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-31T19:03:47.899 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-31T19:03:47.899 DEBUG:teuthology.orchestra.run.vm05:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-31T19:03:47.921 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-31T19:03:47.921 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-31T19:03:47.921 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-31T19:03:47.921 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-31T19:03:47.921 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-31T19:03:48.050 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 98.5% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-31T19:03:48.051 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-31T19:03:48.053 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-31T19:03:48.053 DEBUG:teuthology.orchestra.run.vm05:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-31T19:03:48.115 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-31T19:03:48.117 DEBUG:teuthology.orchestra.run.vm05:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-31T19:03:48.177 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern = core
2026-03-31T19:03:48.186 DEBUG:teuthology.orchestra.run.vm05:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-31T19:03:48.241 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-31T19:03:48.241 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-31T19:03:48.243 INFO:teuthology.task.internal:Transferring archived files...
2026-03-31T19:03:48.243 DEBUG:teuthology.misc:Transferring archived files from vm05:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4338/remote/vm05
2026-03-31T19:03:48.243 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-31T19:03:48.307 INFO:teuthology.task.internal:Removing archive directory...
2026-03-31T19:03:48.307 DEBUG:teuthology.orchestra.run.vm05:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-31T19:03:48.360 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-31T19:03:48.363 INFO:teuthology.task.internal:Not uploading archives.
2026-03-31T19:03:48.363 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-31T19:03:48.365 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-31T19:03:48.365 DEBUG:teuthology.orchestra.run.vm05:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-31T19:03:48.415 INFO:teuthology.orchestra.run.vm05.stdout: 8532144 0 drwxr-xr-x 2 ubuntu ubuntu 6 Mar 31 19:03 /home/ubuntu/cephtest
2026-03-31T19:03:48.416 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-31T19:03:48.421 INFO:teuthology.run:Summary data:
description: rados/standalone/{supported-random-distro$/{centos_latest} workloads/crush}
duration: 289.8105652332306
flavor: default
owner: kyr
success: true
2026-03-31T19:03:48.421 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-31T19:03:48.438 INFO:teuthology.run:pass